Apr 17 17:24:19.217539 ip-10-0-133-87 systemd[1]: Starting Kubernetes Kubelet...
Apr 17 17:24:19.687051 ip-10-0-133-87 kubenswrapper[2565]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:19.687051 ip-10-0-133-87 kubenswrapper[2565]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 17 17:24:19.687051 ip-10-0-133-87 kubenswrapper[2565]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:19.687051 ip-10-0-133-87 kubenswrapper[2565]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 17 17:24:19.687051 ip-10-0-133-87 kubenswrapper[2565]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 17 17:24:19.690114 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.690021 2565 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 17 17:24:19.695963 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695939 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:19.695963 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695959 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:19.695963 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695963 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:19.695963 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695966 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:19.695963 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695970 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695973 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695976 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695979 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695983 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695986 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695989 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695991 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695994 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.695997 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696000 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696002 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696005 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696007 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696010 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696012 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696015 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696021 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696023 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696026 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:19.696157 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696029 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696032 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696034 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696037 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696041 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696045 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696048 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696051 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696054 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696056 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696059 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696062 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696064 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696067 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696070 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696073 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696076 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696079 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696081 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:19.696646 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696084 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696086 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696089 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696092 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696095 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696097 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696100 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696102 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696105 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696108 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696110 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696113 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696116 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696120 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696123 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696125 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696128 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696130 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696133 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696135 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:19.697109 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696138 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696141 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696143 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696146 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696148 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696151 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696154 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696158 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696162 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696165 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696169 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696173 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696177 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696180 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696182 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696186 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696188 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696191 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696194 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:19.697641 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696197 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696199 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696202 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696205 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696633 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696639 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696642 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696645 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696647 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696651 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696655 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696657 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696660 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696663 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696665 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696668 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696671 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696673 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696676 2565 feature_gate.go:328] unrecognized feature gate: Example
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696678 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 17 17:24:19.698090 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696681 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696684 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696687 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696690 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696692 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696695 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696698 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696701 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696704 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696707 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696710 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696712 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696715 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696717 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696720 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696722 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696725 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696728 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696730 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696733 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 17 17:24:19.698582 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696735 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696737 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696740 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696742 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696745 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696747 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696750 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696753 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696755 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696757 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696760 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696763 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696765 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696768 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696772 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696774 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696776 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696779 2565 feature_gate.go:328] unrecognized feature gate: Example2
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696782 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696785 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 17 17:24:19.699068 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696788 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696793 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696797 2565 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696808 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696812 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696815 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696818 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696820 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696824 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696827 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696829 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696832 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696835 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696838 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696840 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696843 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696845 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696848 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696850 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696853 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 17 17:24:19.699574 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696856 2565 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696858 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696862 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696865 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696868 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696870 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696873 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696875 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696878 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.696881 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698441 2565 flags.go:64] FLAG: --address="0.0.0.0"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698455 2565 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698463 2565 flags.go:64] FLAG: --anonymous-auth="true"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698468 2565 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698473 2565 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698477 2565 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698481 2565 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698486 2565 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698490 2565 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698493 2565 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698496 2565 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 17 17:24:19.700058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698500 2565 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698504 2565 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698507 2565 flags.go:64] FLAG: --cgroup-root=""
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698510 2565 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698513 2565 flags.go:64] FLAG: --client-ca-file=""
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698531 2565 flags.go:64] FLAG: --cloud-config=""
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698535 2565 flags.go:64] FLAG: --cloud-provider="external"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698538 2565 flags.go:64] FLAG: --cluster-dns="[]"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698543 2565 flags.go:64] FLAG: --cluster-domain=""
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698546 2565 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698550 2565 flags.go:64] FLAG: --config-dir=""
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698553 2565 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698557 2565 flags.go:64] FLAG: --container-log-max-files="5"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698561 2565 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698564 2565 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698567 2565 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698570 2565 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698573 2565 flags.go:64] FLAG: --contention-profiling="false"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698576 2565 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698579 2565 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698583 2565 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698586 2565 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698591 2565 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698594 2565 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698597 2565 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 17 17:24:19.700574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698600 2565 flags.go:64] FLAG: --enable-load-reader="false"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698606 2565 flags.go:64] FLAG: --enable-server="true"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698609 2565 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698614 2565 flags.go:64] FLAG: --event-burst="100"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698618 2565 flags.go:64] FLAG: --event-qps="50"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698621 2565 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698625 2565 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698628 2565 flags.go:64] FLAG: --eviction-hard=""
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698632 2565 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698635 2565 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698639 2565 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698643 2565 flags.go:64] FLAG: --eviction-soft=""
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698646 2565 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698649 2565 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698652 2565 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698655 2565 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698658 2565 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698660 2565 flags.go:64] FLAG: --fail-swap-on="true"
Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698663 2565 flags.go:64] FLAG:
--feature-gates="" Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698667 2565 flags.go:64] FLAG: --file-check-frequency="20s" Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698671 2565 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698674 2565 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698677 2565 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698680 2565 flags.go:64] FLAG: --healthz-port="10248" Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698683 2565 flags.go:64] FLAG: --help="false" Apr 17 17:24:19.701180 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698687 2565 flags.go:64] FLAG: --hostname-override="ip-10-0-133-87.ec2.internal" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698690 2565 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698693 2565 flags.go:64] FLAG: --http-check-frequency="20s" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698695 2565 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698699 2565 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698702 2565 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698705 2565 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:24:19.698708 2565 flags.go:64] FLAG: --image-service-endpoint="" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698710 2565 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698714 2565 flags.go:64] FLAG: --kube-api-burst="100" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698717 2565 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698721 2565 flags.go:64] FLAG: --kube-api-qps="50" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698723 2565 flags.go:64] FLAG: --kube-reserved="" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698726 2565 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698729 2565 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698733 2565 flags.go:64] FLAG: --kubelet-cgroups="" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698736 2565 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698739 2565 flags.go:64] FLAG: --lock-file="" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698742 2565 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698745 2565 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698748 2565 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698753 2565 flags.go:64] FLAG: --log-json-split-stream="false" Apr 17 17:24:19.701770 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698756 2565 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698759 2565 flags.go:64] FLAG: --log-text-split-stream="false" Apr 17 17:24:19.701770 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698762 2565 flags.go:64] FLAG: --logging-format="text" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698765 2565 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698769 2565 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698772 2565 flags.go:64] FLAG: --manifest-url="" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698775 2565 flags.go:64] FLAG: --manifest-url-header="" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698780 2565 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698783 2565 flags.go:64] FLAG: --max-open-files="1000000" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698787 2565 flags.go:64] FLAG: --max-pods="110" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698790 2565 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698793 2565 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698796 2565 flags.go:64] FLAG: --memory-manager-policy="None" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698799 2565 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698802 2565 flags.go:64] 
FLAG: --minimum-image-ttl-duration="2m0s" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698809 2565 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698812 2565 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698821 2565 flags.go:64] FLAG: --node-status-max-images="50" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698824 2565 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698827 2565 flags.go:64] FLAG: --oom-score-adj="-999" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698831 2565 flags.go:64] FLAG: --pod-cidr="" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698834 2565 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698840 2565 flags.go:64] FLAG: --pod-manifest-path="" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698842 2565 flags.go:64] FLAG: --pod-max-pids="-1" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698846 2565 flags.go:64] FLAG: --pods-per-core="0" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698849 2565 flags.go:64] FLAG: --port="10250" Apr 17 17:24:19.702407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698852 2565 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698855 2565 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0c0971fff6f45a70c" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:24:19.698859 2565 flags.go:64] FLAG: --qos-reserved="" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698862 2565 flags.go:64] FLAG: --read-only-port="10255" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698865 2565 flags.go:64] FLAG: --register-node="true" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698868 2565 flags.go:64] FLAG: --register-schedulable="true" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698871 2565 flags.go:64] FLAG: --register-with-taints="" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698874 2565 flags.go:64] FLAG: --registry-burst="10" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698877 2565 flags.go:64] FLAG: --registry-qps="5" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698880 2565 flags.go:64] FLAG: --reserved-cpus="" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698883 2565 flags.go:64] FLAG: --reserved-memory="" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698887 2565 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698890 2565 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698893 2565 flags.go:64] FLAG: --rotate-certificates="false" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698896 2565 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698899 2565 flags.go:64] FLAG: --runonce="false" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698902 2565 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698905 2565 
flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698908 2565 flags.go:64] FLAG: --seccomp-default="false" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698911 2565 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698914 2565 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698916 2565 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698921 2565 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698924 2565 flags.go:64] FLAG: --storage-driver-password="root" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698927 2565 flags.go:64] FLAG: --storage-driver-secure="false" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698930 2565 flags.go:64] FLAG: --storage-driver-table="stats" Apr 17 17:24:19.702984 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698933 2565 flags.go:64] FLAG: --storage-driver-user="root" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698936 2565 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698939 2565 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698942 2565 flags.go:64] FLAG: --system-cgroups="" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698949 2565 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698955 2565 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 17 
17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698958 2565 flags.go:64] FLAG: --tls-cert-file="" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698961 2565 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698965 2565 flags.go:64] FLAG: --tls-min-version="" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698968 2565 flags.go:64] FLAG: --tls-private-key-file="" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698971 2565 flags.go:64] FLAG: --topology-manager-policy="none" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698974 2565 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698977 2565 flags.go:64] FLAG: --topology-manager-scope="container" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698980 2565 flags.go:64] FLAG: --v="2" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698984 2565 flags.go:64] FLAG: --version="false" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698988 2565 flags.go:64] FLAG: --vmodule="" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698993 2565 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.698996 2565 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699108 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699112 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699115 2565 
feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699119 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699121 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699124 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:19.703619 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699127 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699130 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699132 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699135 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699138 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699141 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699144 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699147 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699149 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 
17:24:19.699153 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699163 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699168 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699172 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699175 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699178 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699181 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699184 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699186 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699189 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:19.704189 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699192 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699194 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699197 
2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699199 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699202 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699205 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699207 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699210 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699212 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699215 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699218 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699220 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699223 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699225 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699228 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:19.704677 ip-10-0-133-87 
kubenswrapper[2565]: W0417 17:24:19.699230 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699233 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699235 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699238 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699254 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:19.704677 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699256 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699259 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699262 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699264 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699267 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699271 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699274 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699277 2565 feature_gate.go:328] unrecognized feature gate: 
InsightsConfig Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699279 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699282 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699284 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699287 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699290 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699292 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699295 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699298 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699300 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699303 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699305 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699308 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:19.705186 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699310 
2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699313 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699315 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699318 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699320 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699323 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699325 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699328 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699330 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699333 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699336 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699338 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699341 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:19.705719 
ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699343 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699346 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699348 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699351 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699358 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699361 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699363 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:19.705719 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.699366 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:19.706204 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.701561 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:24:19.709565 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.709543 2565 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 17 17:24:19.709565 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:24:19.709566 2565 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 17 17:24:19.709643 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709623 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:19.709643 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709629 2565 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 17 17:24:19.709643 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709632 2565 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:19.709643 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709635 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:19.709643 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709638 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:19.709643 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709641 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:19.709643 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709644 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:19.709643 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709648 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709651 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709654 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709657 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709660 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:19.709846 
ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709662 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709665 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709667 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709670 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709672 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709675 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709678 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709681 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709683 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709686 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709688 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709691 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709693 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 
17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709696 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709699 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:19.709846 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709701 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709704 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709706 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709709 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709712 2565 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709715 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709718 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709721 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709724 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709726 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709729 2565 feature_gate.go:328] unrecognized 
feature gate: ManagedBootImagesAzure Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709731 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709734 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709736 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709739 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709742 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709744 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709747 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709749 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:19.710414 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709752 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709755 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709757 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709760 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 
17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709763 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709765 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709767 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709770 2565 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709774 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709778 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709781 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709784 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709786 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709789 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709791 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709793 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709796 2565 feature_gate.go:328] 
unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709798 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709801 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:19.710877 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709804 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709807 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709810 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709812 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709815 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709818 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709821 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709823 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709826 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709829 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 
17:24:19.709831 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709834 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709836 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709839 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709841 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709844 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709846 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709849 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709853 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 17:24:19.711368 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709858 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709861 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.709866 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709977 2565 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709982 2565 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709985 2565 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709988 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709991 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709993 2565 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709996 2565 feature_gate.go:328] unrecognized feature 
gate: AutomatedEtcdBackup Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.709999 2565 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710001 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710004 2565 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710014 2565 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710017 2565 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710020 2565 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 17 17:24:19.711840 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710023 2565 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710026 2565 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710030 2565 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710034 2565 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710037 2565 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710040 2565 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710044 2565 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710047 2565 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710049 2565 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710052 2565 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710055 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710058 2565 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710060 2565 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710063 2565 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710066 2565 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710068 2565 
feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710071 2565 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710073 2565 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710076 2565 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 17 17:24:19.712229 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710078 2565 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710081 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710083 2565 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710086 2565 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710088 2565 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710091 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710093 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710096 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710099 2565 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 17 17:24:19.712704 ip-10-0-133-87 
kubenswrapper[2565]: W0417 17:24:19.710101 2565 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710104 2565 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710108 2565 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710110 2565 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710113 2565 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710116 2565 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710118 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710121 2565 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710123 2565 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710125 2565 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 17 17:24:19.712704 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710128 2565 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710131 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710134 2565 feature_gate.go:328] 
unrecognized feature gate: NetworkLiveMigration Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710137 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710139 2565 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710142 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710144 2565 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710147 2565 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710149 2565 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710152 2565 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710154 2565 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710157 2565 feature_gate.go:328] unrecognized feature gate: Example Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710159 2565 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710161 2565 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710164 2565 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710167 2565 
feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710169 2565 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710172 2565 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710175 2565 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710177 2565 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 17 17:24:19.713196 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710180 2565 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710182 2565 feature_gate.go:328] unrecognized feature gate: Example2 Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710185 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710188 2565 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710192 2565 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710195 2565 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710198 2565 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710201 2565 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710203 2565 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710206 2565 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710208 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710211 2565 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710214 2565 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710216 2565 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:19.710219 2565 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 17 17:24:19.713842 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.710224 2565 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false 
MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 17 17:24:19.714226 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.711008 2565 server.go:962] "Client rotation is on, will bootstrap in background" Apr 17 17:24:19.714226 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.713199 2565 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 17 17:24:19.714226 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.714101 2565 server.go:1019] "Starting client certificate rotation" Apr 17 17:24:19.714226 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.714198 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:24:19.714356 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.714232 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 17 17:24:19.740986 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.740962 2565 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:24:19.746016 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.745601 2565 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 17 17:24:19.759292 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.759266 2565 log.go:25] "Validated CRI v1 runtime API" Apr 17 17:24:19.765117 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.765098 2565 log.go:25] "Validated CRI v1 image API" Apr 17 17:24:19.766800 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.766783 2565 server.go:1452] "Using cgroup 
driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 17 17:24:19.771950 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.771924 2565 fs.go:135] Filesystem UUIDs: map[126ec67f-bbac-49c9-9ba8-a6a9a1c83093:/dev/nvme0n1p4 597e2a13-4855-4da8-984a-4a133c88c204:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2] Apr 17 17:24:19.772005 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.771951 2565 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 17 17:24:19.772862 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.772834 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 17 17:24:19.777562 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.777447 2565 manager.go:217] Machine: {Timestamp:2026-04-17 17:24:19.775701602 +0000 UTC m=+0.433745900 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3097258 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec2ca47e5b78a72f7a1653d249513061 SystemUUID:ec2ca47e-5b78-a72f-7a16-53d249513061 BootID:4c7839a0-0dc6-486b-bce6-a7d2c128c200 Filesystems:[{Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm 
DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:6e:b2:c0:d5:63 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:6e:b2:c0:d5:63 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:1a:05:cf:9c:a7:a3 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 17 17:24:19.777562 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.777560 2565 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 17 17:24:19.777666 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.777639 2565 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 17 17:24:19.777979 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.777956 2565 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 17 17:24:19.778141 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.777981 2565 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-87.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 17 17:24:19.778183 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.778150 2565 topology_manager.go:138] "Creating topology manager with none policy" Apr 17 17:24:19.778183 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.778159 2565 container_manager_linux.go:306] "Creating device plugin manager" Apr 17 17:24:19.778183 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.778172 2565 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:24:19.779139 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.779129 2565 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 17 17:24:19.780654 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.780643 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:19.780764 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.780755 2565 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 17 17:24:19.783008 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.782987 2565 kubelet.go:491] "Attempting to sync node with API server" Apr 17 17:24:19.783008 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.783012 2565 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 17 17:24:19.783087 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.783025 2565 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 17 17:24:19.783087 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.783034 2565 kubelet.go:397] "Adding apiserver pod source" Apr 17 17:24:19.783087 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.783044 2565 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 17 17:24:19.784050 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.784036 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:24:19.784117 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.784055 2565 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 17 17:24:19.786901 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.786885 2565 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 17 17:24:19.788262 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.788233 2565 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 17 17:24:19.790128 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790116 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 17 17:24:19.790128 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790133 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790140 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790146 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790151 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790157 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790163 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 17 
17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790170 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790177 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790192 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790207 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 17 17:24:19.790219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.790215 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 17 17:24:19.791116 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.791105 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 17 17:24:19.791116 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.791116 2565 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 17 17:24:19.792565 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.792533 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 17 17:24:19.792653 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.792626 2565 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-87.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 17 17:24:19.794143 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.794126 2565 csi_plugin.go:988] Failed 
to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-87.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 17 17:24:19.794768 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.794755 2565 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 17 17:24:19.794855 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.794797 2565 server.go:1295] "Started kubelet" Apr 17 17:24:19.794903 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.794850 2565 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 17 17:24:19.794947 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.794905 2565 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 17 17:24:19.796635 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.796608 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9k78z" Apr 17 17:24:19.797491 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.797469 2565 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 17 17:24:19.798193 ip-10-0-133-87 systemd[1]: Started Kubernetes Kubelet. 
Apr 17 17:24:19.803983 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.803962 2565 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 17 17:24:19.804310 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.804296 2565 server.go:317] "Adding debug handlers to kubelet server" Apr 17 17:24:19.804501 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.804485 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-9k78z" Apr 17 17:24:19.806788 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.805714 2565 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-87.ec2.internal.18a734d3377f1315 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-87.ec2.internal,UID:ip-10-0-133-87.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-87.ec2.internal,},FirstTimestamp:2026-04-17 17:24:19.794768661 +0000 UTC m=+0.452812959,LastTimestamp:2026-04-17 17:24:19.794768661 +0000 UTC m=+0.452812959,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-87.ec2.internal,}" Apr 17 17:24:19.808385 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.808354 2565 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 17 17:24:19.809341 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.809322 2565 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 17 17:24:19.809431 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.809348 2565 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 17 17:24:19.810054 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.809993 2565 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 17 17:24:19.810054 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.809997 2565 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 17 17:24:19.810054 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810022 2565 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 17 17:24:19.810227 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810181 2565 reconstruct.go:97] "Volume reconstruction finished" Apr 17 17:24:19.810227 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810192 2565 reconciler.go:26] "Reconciler: start to sync state" Apr 17 17:24:19.810371 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810355 2565 factory.go:153] Registering CRI-O factory Apr 17 17:24:19.810436 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810426 2565 factory.go:223] Registration of the crio container factory successfully Apr 17 17:24:19.810494 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810483 2565 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 17 17:24:19.810540 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810497 2565 factory.go:55] Registering systemd factory Apr 17 17:24:19.810540 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:24:19.810506 2565 factory.go:223] Registration of the systemd container factory successfully Apr 17 17:24:19.810540 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810529 2565 factory.go:103] Registering Raw factory Apr 17 17:24:19.810683 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810543 2565 manager.go:1196] Started watching for new ooms in manager Apr 17 17:24:19.810736 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.810716 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found" Apr 17 17:24:19.810958 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.810934 2565 manager.go:319] Starting recovery of all containers Apr 17 17:24:19.811555 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.811372 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:19.814071 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.814045 2565 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-133-87.ec2.internal\" not found" node="ip-10-0-133-87.ec2.internal" Apr 17 17:24:19.820625 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.820607 2565 manager.go:324] Recovery completed Apr 17 17:24:19.825007 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.824994 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:19.827560 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.827544 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:19.827624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.827594 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:19.827624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.827607 2565 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:19.828074 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.828057 2565 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 17 17:24:19.828074 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.828070 2565 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 17 17:24:19.828159 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.828087 2565 state_mem.go:36] "Initialized new in-memory state store" Apr 17 17:24:19.830577 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.830565 2565 policy_none.go:49] "None policy: Start" Apr 17 17:24:19.830624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.830581 2565 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 17 17:24:19.830624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.830591 2565 state_mem.go:35] "Initializing new in-memory state store" Apr 17 17:24:19.879827 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.879811 2565 manager.go:341] "Starting Device Plugin manager" Apr 17 17:24:19.879929 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.879862 2565 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 17 17:24:19.879929 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.879876 2565 server.go:85] "Starting device plugin registration server" Apr 17 17:24:19.880167 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.880156 2565 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 17 17:24:19.880202 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.880170 2565 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 17 17:24:19.880301 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.880285 2565 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 17 
17:24:19.880410 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.880385 2565 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 17 17:24:19.880410 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.880396 2565 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 17 17:24:19.880961 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.880941 2565 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 17 17:24:19.881043 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.880981 2565 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-87.ec2.internal\" not found" Apr 17 17:24:19.945411 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.945323 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 17 17:24:19.946651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.946631 2565 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 17 17:24:19.946708 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.946667 2565 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 17 17:24:19.946708 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.946695 2565 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." 
Apr 17 17:24:19.946708 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.946707 2565 kubelet.go:2451] "Starting kubelet main sync loop" Apr 17 17:24:19.946806 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.946748 2565 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 17 17:24:19.949774 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.949750 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:19.980550 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.980510 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:19.981718 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.981700 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:19.981800 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.981731 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:19.981800 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.981741 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:19.981800 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.981765 2565 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-87.ec2.internal" Apr 17 17:24:19.990292 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:19.990273 2565 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-87.ec2.internal" Apr 17 17:24:19.990391 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:19.990299 2565 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-87.ec2.internal\": node \"ip-10-0-133-87.ec2.internal\" not found" Apr 17 17:24:20.006444 
ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.006420 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found" Apr 17 17:24:20.046980 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.046940 2565 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal"] Apr 17 17:24:20.047073 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.047028 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:20.048016 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.047999 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:20.048096 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.048027 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:20.048096 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.048036 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:20.049198 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.049186 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:20.049375 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.049360 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.049435 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.049390 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:20.049971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.049954 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:20.050049 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.049988 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:20.050049 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.050003 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:20.050049 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.049958 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:20.050185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.050050 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:20.050185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.050062 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:20.051350 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.051337 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.051398 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.051363 2565 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 17 17:24:20.052089 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.052073 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientMemory" Apr 17 17:24:20.052143 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.052106 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasNoDiskPressure" Apr 17 17:24:20.052143 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.052121 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeHasSufficientPID" Apr 17 17:24:20.076893 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.076873 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-87.ec2.internal\" not found" node="ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.081142 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.081127 2565 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-87.ec2.internal\" not found" node="ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.107272 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.107236 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found" Apr 17 17:24:20.112594 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.112577 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8cf3e3e8476c10b85dc36d1342e54de1-config\") pod 
\"kube-apiserver-proxy-ip-10-0-133-87.ec2.internal\" (UID: \"8cf3e3e8476c10b85dc36d1342e54de1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.112663 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.112603 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/51db773c13b85f7ab11509c484c019d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal\" (UID: \"51db773c13b85f7ab11509c484c019d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.112663 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.112620 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51db773c13b85f7ab11509c484c019d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal\" (UID: \"51db773c13b85f7ab11509c484c019d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.208280 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.208193 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found" Apr 17 17:24:20.213562 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.213539 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8cf3e3e8476c10b85dc36d1342e54de1-config\") pod \"kube-apiserver-proxy-ip-10-0-133-87.ec2.internal\" (UID: \"8cf3e3e8476c10b85dc36d1342e54de1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.213620 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.213574 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/51db773c13b85f7ab11509c484c019d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal\" (UID: \"51db773c13b85f7ab11509c484c019d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.213620 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.213596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51db773c13b85f7ab11509c484c019d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal\" (UID: \"51db773c13b85f7ab11509c484c019d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.213711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.213645 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/8cf3e3e8476c10b85dc36d1342e54de1-config\") pod \"kube-apiserver-proxy-ip-10-0-133-87.ec2.internal\" (UID: \"8cf3e3e8476c10b85dc36d1342e54de1\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.213711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.213691 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/51db773c13b85f7ab11509c484c019d3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal\" (UID: \"51db773c13b85f7ab11509c484c019d3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" Apr 17 17:24:20.213792 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.213645 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/51db773c13b85f7ab11509c484c019d3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal\" (UID: \"51db773c13b85f7ab11509c484c019d3\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal"
Apr 17 17:24:20.308745 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.308703 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found"
Apr 17 17:24:20.379205 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.379176 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal"
Apr 17 17:24:20.384082 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.384059 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal"
Apr 17 17:24:20.408899 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.408865 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found"
Apr 17 17:24:20.510015 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.509919 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found"
Apr 17 17:24:20.610396 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.610359 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found"
Apr 17 17:24:20.710829 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.710784 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found"
Apr 17 17:24:20.713994 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.713972 2565 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 17 17:24:20.714354 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.714142 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:24:20.714354 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.714209 2565 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 17 17:24:20.806751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.806715 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-16 17:19:19 +0000 UTC" deadline="2027-11-03 02:12:02.031100013 +0000 UTC"
Apr 17 17:24:20.806751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.806748 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13544h47m41.224356001s"
Apr 17 17:24:20.809780 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.809754 2565 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 17 17:24:20.810875 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.810850 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found"
Apr 17 17:24:20.821601 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.821582 2565 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 17 17:24:20.845538 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.845505 2565 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-gnxr6"
Apr 17 17:24:20.854290 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.854263 2565 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-gnxr6"
Apr 17 17:24:20.891633 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:20.891590 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8cf3e3e8476c10b85dc36d1342e54de1.slice/crio-6d49eb364fa240ac54905cffbced6c10e0e9ca8ad89f206fde7fcc286d93213b WatchSource:0}: Error finding container 6d49eb364fa240ac54905cffbced6c10e0e9ca8ad89f206fde7fcc286d93213b: Status 404 returned error can't find the container with id 6d49eb364fa240ac54905cffbced6c10e0e9ca8ad89f206fde7fcc286d93213b
Apr 17 17:24:20.891869 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:20.891847 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod51db773c13b85f7ab11509c484c019d3.slice/crio-89630119b80a61cec4af4eee5b01256638d6f43b750c57311be43277c7f38226 WatchSource:0}: Error finding container 89630119b80a61cec4af4eee5b01256638d6f43b750c57311be43277c7f38226: Status 404 returned error can't find the container with id 89630119b80a61cec4af4eee5b01256638d6f43b750c57311be43277c7f38226
Apr 17 17:24:20.896135 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.896117 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 17 17:24:20.911556 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:20.911528 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found"
Apr 17 17:24:20.949715 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.949658 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal" event={"ID":"8cf3e3e8476c10b85dc36d1342e54de1","Type":"ContainerStarted","Data":"6d49eb364fa240ac54905cffbced6c10e0e9ca8ad89f206fde7fcc286d93213b"}
Apr 17 17:24:20.950514 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:20.950490 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" event={"ID":"51db773c13b85f7ab11509c484c019d3","Type":"ContainerStarted","Data":"89630119b80a61cec4af4eee5b01256638d6f43b750c57311be43277c7f38226"}
Apr 17 17:24:21.011771 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.011732 2565 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-87.ec2.internal\" not found"
Apr 17 17:24:21.025347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.025325 2565 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:21.110169 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.110078 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal"
Apr 17 17:24:21.120948 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.120919 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 17:24:21.122780 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.122766 2565 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal"
Apr 17 17:24:21.131146 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.131130 2565 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 17 17:24:21.224276 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.224237 2565 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 17 17:24:21.783567 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.783535 2565 apiserver.go:52] "Watching apiserver"
Apr 17 17:24:21.791151 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.790464 2565 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 17 17:24:21.793422 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.793394 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh","openshift-dns/node-resolver-wq4vk","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal","openshift-multus/multus-86k48","openshift-ovn-kubernetes/ovnkube-node-25fqz","kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal","openshift-cluster-node-tuning-operator/tuned-j2tw4","openshift-image-registry/node-ca-mz8c6","openshift-multus/multus-additional-cni-plugins-4jwq8","openshift-multus/network-metrics-daemon-sx5fl","openshift-network-diagnostics/network-check-target-5vptw","openshift-network-operator/iptables-alerter-z8bdn","kube-system/konnectivity-agent-pzhcg"]
Apr 17 17:24:21.796642 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.796615 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pzhcg"
Apr 17 17:24:21.797674 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.797655 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.798830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.798808 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.799152 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.799118 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 17 17:24:21.799360 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.799343 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-6w5bw\""
Apr 17 17:24:21.799489 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.799388 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 17 17:24:21.799593 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.799577 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-b926f\""
Apr 17 17:24:21.799683 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.799609 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\""
Apr 17 17:24:21.799762 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.799747 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\""
Apr 17 17:24:21.800348 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.800328 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.800500 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.800480 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.800986 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.800958 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:24:21.801125 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.801106 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-jwdsh\""
Apr 17 17:24:21.801557 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.801533 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 17 17:24:21.801742 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.801720 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.802284 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.802237 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 17 17:24:21.802640 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.802606 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 17 17:24:21.802742 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.802657 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 17 17:24:21.802742 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.802697 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 17 17:24:21.802881 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.802848 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 17 17:24:21.802881 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.802859 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-42rvn\""
Apr 17 17:24:21.803103 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803086 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 17 17:24:21.803103 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803101 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 17 17:24:21.803314 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803188 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.803314 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803234 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 17 17:24:21.803611 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803589 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 17 17:24:21.803767 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803745 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 17 17:24:21.803767 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803762 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-r5w7h\""
Apr 17 17:24:21.803767 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803758 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 17 17:24:21.803970 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803911 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 17 17:24:21.804011 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803984 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 17 17:24:21.804011 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.803992 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-c4bww\""
Apr 17 17:24:21.804598 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.804581 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.805384 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.805126 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 17 17:24:21.805384 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.805207 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6fv27\""
Apr 17 17:24:21.805384 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.805261 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 17 17:24:21.805567 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.805438 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 17 17:24:21.806209 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.806025 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:21.806209 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.806090 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:21.806776 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.806276 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 17 17:24:21.806928 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.806908 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 17 17:24:21.807010 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.806996 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-z4hj5\""
Apr 17 17:24:21.807400 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.807380 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:21.807471 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.807444 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:21.808727 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.808688 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z8bdn"
Apr 17 17:24:21.810513 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.810492 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 17 17:24:21.810613 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.810536 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 17 17:24:21.810613 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.810597 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-sx926\""
Apr 17 17:24:21.810743 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.810730 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 17 17:24:21.811298 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.811282 2565 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 17 17:24:21.823040 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-system-cni-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823132 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823052 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b8bc3fe-beaf-4464-9be8-84388596e77a-cni-binary-copy\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823132 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823079 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-cni-bin\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823226 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823133 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-etc-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.823226 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823178 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-ovn\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.823226 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823205 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fe5d881-6f86-4878-aefd-8707bc6216fb-tmp-dir\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.823381 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823255 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-conf-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823381 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823292 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.823381 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823320 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/490c70db-b612-47c1-a980-96f5b8cc2cbc-konnectivity-ca\") pod \"konnectivity-agent-pzhcg\" (UID: \"490c70db-b612-47c1-a980-96f5b8cc2cbc\") " pod="kube-system/konnectivity-agent-pzhcg"
Apr 17 17:24:21.823381 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823344 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-modprobe-d\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.823381 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823369 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5dkxx\" (UniqueName: \"kubernetes.io/projected/72790912-9bff-41c5-8244-6a056c8fc59c-kube-api-access-5dkxx\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.823586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823394 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9fqvj\" (UniqueName: \"kubernetes.io/projected/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-kube-api-access-9fqvj\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.823586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823419 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fe5d881-6f86-4878-aefd-8707bc6216fb-hosts-file\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.823586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823439 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-kubelet\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823462 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-daemon-config\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823485 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-lib-modules\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.823586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823523 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-var-lib-kubelet\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.823586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823545 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-slash\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.823586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823568 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-systemd\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823591 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-os-release\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823653 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823691 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-hostroot\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823715 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8wk5w\" (UniqueName: \"kubernetes.io/projected/5b8bc3fe-beaf-4464-9be8-84388596e77a-kube-api-access-8wk5w\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823739 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysctl-d\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823762 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-host\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823781 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cni-binary-copy\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823795 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-etc-kubernetes\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823809 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823830 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rg9jm\" (UniqueName: \"kubernetes.io/projected/509a5284-811e-4026-9b46-fb36dcfe95d4-kube-api-access-rg9jm\") pod \"iptables-alerter-z8bdn\" (UID: \"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823853 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-var-lib-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823883 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-serviceca\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.823899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823895 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72790912-9bff-41c5-8244-6a056c8fc59c-tmp\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823914 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-kubelet\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823946 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-cni-netd\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823972 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.823999 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-ovnkube-config\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824021 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824035 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-socket-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824050 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vl8fc\" (UniqueName: \"kubernetes.io/projected/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-kube-api-access-vl8fc\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824068 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-sys\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824099 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-system-cni-dir\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824127 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t7tpf\" (UniqueName: \"kubernetes.io/projected/7fe5d881-6f86-4878-aefd-8707bc6216fb-kube-api-access-t7tpf\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824146 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-k8s-cni-cncf-io\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824209 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/509a5284-811e-4026-9b46-fb36dcfe95d4-iptables-alerter-script\") pod \"iptables-alerter-z8bdn\" (UID: \"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824277 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-kubernetes\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824345 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72790912-9bff-41c5-8244-6a056c8fc59c-etc-tuned\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.824374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824378 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-env-overrides\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824420 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-sys-fs\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824451 2565
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-socket-dir-parent\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824468 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-multus-certs\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824482 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/490c70db-b612-47c1-a980-96f5b8cc2cbc-agent-certs\") pod \"konnectivity-agent-pzhcg\" (UID: \"490c70db-b612-47c1-a980-96f5b8cc2cbc\") " pod="kube-system/konnectivity-agent-pzhcg" Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824500 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysctl-conf\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824520 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-run\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 
17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824546 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vw4md\" (UniqueName: \"kubernetes.io/projected/949deda6-6022-4b2b-898e-a6c55f650ba8-kube-api-access-vw4md\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824575 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-systemd\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824615 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-log-socket\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824660 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-host\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.825051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.824872 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-node-log\") pod \"ovnkube-node-25fqz\" (UID: 
\"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825076 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-registration-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825234 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-etc-selinux\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825308 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cmrk2\" (UniqueName: \"kubernetes.io/projected/3a4d7fdb-21de-4fae-8073-07b155920dfd-kube-api-access-cmrk2\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825381 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825442 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-cni-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825481 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-systemd-units\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825531 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825581 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-run-ovn-kubernetes\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825629 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/949deda6-6022-4b2b-898e-a6c55f650ba8-ovn-node-metrics-cert\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825660 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-device-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825717 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cnibin\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825754 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825802 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-cnibin\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825836 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: 
\"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysconfig\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825865 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-run-netns\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.827913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825895 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-os-release\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.828854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825924 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-netns\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.828854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825950 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-cni-multus\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.828854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.825984 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mfzxw\" (UniqueName: \"kubernetes.io/projected/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-kube-api-access-mfzxw\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:21.828854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.826023 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/509a5284-811e-4026-9b46-fb36dcfe95d4-host-slash\") pod \"iptables-alerter-z8bdn\" (UID: \"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn" Apr 17 17:24:21.828854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.826052 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-cni-bin\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.828854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.826088 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-ovnkube-script-lib\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.854985 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.854953 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:20 +0000 UTC" deadline="2027-10-01 16:09:46.109409923 +0000 UTC" Apr 17 17:24:21.855065 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.854988 2565 
certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12766h45m24.254426368s" Apr 17 17:24:21.926365 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926333 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-cni-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926371 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-systemd-units\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926399 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926422 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-run-ovn-kubernetes\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926447 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/949deda6-6022-4b2b-898e-a6c55f650ba8-ovn-node-metrics-cert\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926461 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-cni-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926470 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-device-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926464 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-systemd-units\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926493 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cnibin\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8" Apr 17 17:24:21.926542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926516 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926540 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-cnibin\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926597 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysconfig\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926621 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-run-netns\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926642 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-os-release\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926665 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-netns\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926689 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-cni-multus\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926715 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mfzxw\" (UniqueName: \"kubernetes.io/projected/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-kube-api-access-mfzxw\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926733 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-cnibin\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926743 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/509a5284-811e-4026-9b46-fb36dcfe95d4-host-slash\") pod \"iptables-alerter-z8bdn\" (UID: \"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926783 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: 
\"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-device-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926787 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-cni-bin\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926803 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926817 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-ovnkube-script-lib\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926822 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cnibin\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926858 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-system-cni-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926885 2565 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926897 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-run-ovn-kubernetes\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.926963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926908 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-system-cni-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926973 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-os-release\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927007 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/509a5284-811e-4026-9b46-fb36dcfe95d4-host-slash\") pod \"iptables-alerter-z8bdn\" (UID: 
\"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927031 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-run-netns\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927068 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-cni-bin\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927103 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-netns\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927178 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysconfig\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927213 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-cni-multus\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927472 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b8bc3fe-beaf-4464-9be8-84388596e77a-cni-binary-copy\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.926892 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5b8bc3fe-beaf-4464-9be8-84388596e77a-cni-binary-copy\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927590 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-cni-bin\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927643 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-etc-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927655 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-ovnkube-script-lib\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927683 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-ovn\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927709 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fe5d881-6f86-4878-aefd-8707bc6216fb-tmp-dir\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927710 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-etc-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927750 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-conf-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927769 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " 
pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.927848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927786 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/490c70db-b612-47c1-a980-96f5b8cc2cbc-konnectivity-ca\") pod \"konnectivity-agent-pzhcg\" (UID: \"490c70db-b612-47c1-a980-96f5b8cc2cbc\") " pod="kube-system/konnectivity-agent-pzhcg"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927822 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-modprobe-d\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927840 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5dkxx\" (UniqueName: \"kubernetes.io/projected/72790912-9bff-41c5-8244-6a056c8fc59c-kube-api-access-5dkxx\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927857 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9fqvj\" (UniqueName: \"kubernetes.io/projected/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-kube-api-access-9fqvj\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927875 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fe5d881-6f86-4878-aefd-8707bc6216fb-hosts-file\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927895 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-kubelet\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927911 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-daemon-config\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927916 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7fe5d881-6f86-4878-aefd-8707bc6216fb-tmp-dir\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927923 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927979 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-conf-dir\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.927979 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-kubelet\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928016 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-lib-modules\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928033 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/7fe5d881-6f86-4878-aefd-8707bc6216fb-hosts-file\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928038 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-var-lib-kubelet\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928061 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-slash\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928076 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-modprobe-d\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928086 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-systemd\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928117 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-var-lib-cni-bin\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.928711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928128 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-os-release\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928157 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928187 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-hostroot\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928210 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8wk5w\" (UniqueName: \"kubernetes.io/projected/5b8bc3fe-beaf-4464-9be8-84388596e77a-kube-api-access-8wk5w\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928298 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-hostroot\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928310 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysctl-d\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928334 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-systemd\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928345 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-host\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928372 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cni-binary-copy\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928398 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-etc-kubernetes\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928451 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rg9jm\" (UniqueName: \"kubernetes.io/projected/509a5284-811e-4026-9b46-fb36dcfe95d4-kube-api-access-rg9jm\") pod \"iptables-alerter-z8bdn\" (UID: 
\"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928477 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-var-lib-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928505 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-serviceca\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928531 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/72790912-9bff-41c5-8244-6a056c8fc59c-tmp\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928567 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-daemon-config\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928611 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/490c70db-b612-47c1-a980-96f5b8cc2cbc-konnectivity-ca\") pod \"konnectivity-agent-pzhcg\" (UID: \"490c70db-b612-47c1-a980-96f5b8cc2cbc\") " pod="kube-system/konnectivity-agent-pzhcg"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928616 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-etc-kubernetes\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.929624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928661 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-lib-modules\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.928699 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928716 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-host\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928765 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-var-lib-kubelet\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.928774 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs podName:a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa nodeName:}" failed. No retries permitted until 2026-04-17 17:24:22.428745661 +0000 UTC m=+3.086789952 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs") pod "network-metrics-daemon-sx5fl" (UID: "a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928801 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-slash\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928845 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-var-lib-openvswitch\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928984 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-tuning-conf-dir\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.928995 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: 
\"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-run-ovn\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929028 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-kubelet\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929029 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-os-release\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929048 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysctl-d\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929071 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-cni-netd\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929078 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-kubelet\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929120 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929121 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929154 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.930456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929165 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-ovnkube-config\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929123 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-host-cni-netd\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929192 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929233 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-kubelet-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929257 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-serviceca\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929264 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-socket-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.931237 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929304 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vl8fc\" (UniqueName: \"kubernetes.io/projected/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-kube-api-access-vl8fc\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929348 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-sys\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929377 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-system-cni-dir\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929427 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t7tpf\" (UniqueName: \"kubernetes.io/projected/7fe5d881-6f86-4878-aefd-8707bc6216fb-kube-api-access-t7tpf\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929453 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-k8s-cni-cncf-io\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929471 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-sys\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929482 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/509a5284-811e-4026-9b46-fb36dcfe95d4-iptables-alerter-script\") pod \"iptables-alerter-z8bdn\" (UID: \"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929507 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-kubernetes\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929507 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-system-cni-dir\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929530 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72790912-9bff-41c5-8244-6a056c8fc59c-etc-tuned\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929548 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-k8s-cni-cncf-io\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.931237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929555 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-env-overrides\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929593 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-sys-fs\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929598 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-kubernetes\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929621 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-socket-dir-parent\") pod \"multus-86k48\" (UID: 
\"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929647 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-ovnkube-config\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929649 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-multus-certs\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929680 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-host-run-multus-certs\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929691 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/490c70db-b612-47c1-a980-96f5b8cc2cbc-agent-certs\") pod \"konnectivity-agent-pzhcg\" (UID: \"490c70db-b612-47c1-a980-96f5b8cc2cbc\") " pod="kube-system/konnectivity-agent-pzhcg"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929718 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysctl-conf\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929726 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-sys-fs\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929742 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-run\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929769 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vw4md\" (UniqueName: \"kubernetes.io/projected/949deda6-6022-4b2b-898e-a6c55f650ba8-kube-api-access-vw4md\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929772 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/5b8bc3fe-beaf-4464-9be8-84388596e77a-multus-socket-dir-parent\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929817 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-cni-binary-copy\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929824 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-run\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929870 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-sysctl-conf\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929913 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/949deda6-6022-4b2b-898e-a6c55f650ba8-env-overrides\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.929971 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-systemd\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4"
Apr 17 17:24:21.932064 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930003 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-log-socket\") pod \"ovnkube-node-25fqz\" (UID: 
\"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930007 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/509a5284-811e-4026-9b46-fb36dcfe95d4-iptables-alerter-script\") pod \"iptables-alerter-z8bdn\" (UID: \"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930029 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-host\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930060 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-log-socket\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930066 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-host\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930031 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/72790912-9bff-41c5-8244-6a056c8fc59c-etc-systemd\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " 
pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930056 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-node-log\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930109 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/949deda6-6022-4b2b-898e-a6c55f650ba8-node-log\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930117 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-registration-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930145 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-etc-selinux\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930149 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-socket-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: 
\"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930156 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-registration-dir\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930187 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cmrk2\" (UniqueName: \"kubernetes.io/projected/3a4d7fdb-21de-4fae-8073-07b155920dfd-kube-api-access-cmrk2\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930209 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3a4d7fdb-21de-4fae-8073-07b155920dfd-etc-selinux\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.930213 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.932271 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" 
(UniqueName: \"kubernetes.io/empty-dir/72790912-9bff-41c5-8244-6a056c8fc59c-tmp\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.932304 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/72790912-9bff-41c5-8244-6a056c8fc59c-etc-tuned\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.932939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.932667 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/949deda6-6022-4b2b-898e-a6c55f650ba8-ovn-node-metrics-cert\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.933784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.932671 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/490c70db-b612-47c1-a980-96f5b8cc2cbc-agent-certs\") pod \"konnectivity-agent-pzhcg\" (UID: \"490c70db-b612-47c1-a980-96f5b8cc2cbc\") " pod="kube-system/konnectivity-agent-pzhcg" Apr 17 17:24:21.940831 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.940801 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:21.940930 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.940835 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:21.940930 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.940851 2565 projected.go:194] 
Error preparing data for projected volume kube-api-access-c5dcs for pod openshift-network-diagnostics/network-check-target-5vptw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:21.941067 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:21.940951 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs podName:912cdda1-8b21-4e90-bb49-4d95a21c8619 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:22.440924245 +0000 UTC m=+3.098968538 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-c5dcs" (UniqueName: "kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs") pod "network-check-target-5vptw" (UID: "912cdda1-8b21-4e90-bb49-4d95a21c8619") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:21.942402 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.942377 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vl8fc\" (UniqueName: \"kubernetes.io/projected/5025af7c-9beb-4b9e-9e90-2e2d44ab467a-kube-api-access-vl8fc\") pod \"multus-additional-cni-plugins-4jwq8\" (UID: \"5025af7c-9beb-4b9e-9e90-2e2d44ab467a\") " pod="openshift-multus/multus-additional-cni-plugins-4jwq8" Apr 17 17:24:21.943217 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.943174 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9fqvj\" (UniqueName: \"kubernetes.io/projected/7842b7a8-6434-47ce-8ce5-d1c63dd11da1-kube-api-access-9fqvj\") pod \"node-ca-mz8c6\" (UID: \"7842b7a8-6434-47ce-8ce5-d1c63dd11da1\") " pod="openshift-image-registry/node-ca-mz8c6" Apr 17 17:24:21.943743 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:24:21.943693 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t7tpf\" (UniqueName: \"kubernetes.io/projected/7fe5d881-6f86-4878-aefd-8707bc6216fb-kube-api-access-t7tpf\") pod \"node-resolver-wq4vk\" (UID: \"7fe5d881-6f86-4878-aefd-8707bc6216fb\") " pod="openshift-dns/node-resolver-wq4vk" Apr 17 17:24:21.944264 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.944211 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8wk5w\" (UniqueName: \"kubernetes.io/projected/5b8bc3fe-beaf-4464-9be8-84388596e77a-kube-api-access-8wk5w\") pod \"multus-86k48\" (UID: \"5b8bc3fe-beaf-4464-9be8-84388596e77a\") " pod="openshift-multus/multus-86k48" Apr 17 17:24:21.944641 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.944612 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5dkxx\" (UniqueName: \"kubernetes.io/projected/72790912-9bff-41c5-8244-6a056c8fc59c-kube-api-access-5dkxx\") pod \"tuned-j2tw4\" (UID: \"72790912-9bff-41c5-8244-6a056c8fc59c\") " pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:21.945124 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.945065 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cmrk2\" (UniqueName: \"kubernetes.io/projected/3a4d7fdb-21de-4fae-8073-07b155920dfd-kube-api-access-cmrk2\") pod \"aws-ebs-csi-driver-node-s7zwh\" (UID: \"3a4d7fdb-21de-4fae-8073-07b155920dfd\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:21.945207 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.945155 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mfzxw\" (UniqueName: \"kubernetes.io/projected/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-kube-api-access-mfzxw\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 
17:24:21.945418 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.945376 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vw4md\" (UniqueName: \"kubernetes.io/projected/949deda6-6022-4b2b-898e-a6c55f650ba8-kube-api-access-vw4md\") pod \"ovnkube-node-25fqz\" (UID: \"949deda6-6022-4b2b-898e-a6c55f650ba8\") " pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:21.945829 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:21.945807 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rg9jm\" (UniqueName: \"kubernetes.io/projected/509a5284-811e-4026-9b46-fb36dcfe95d4-kube-api-access-rg9jm\") pod \"iptables-alerter-z8bdn\" (UID: \"509a5284-811e-4026-9b46-fb36dcfe95d4\") " pod="openshift-network-operator/iptables-alerter-z8bdn" Apr 17 17:24:22.110999 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.110904 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-pzhcg" Apr 17 17:24:22.115847 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.115825 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wq4vk" Apr 17 17:24:22.124488 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.124464 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" Apr 17 17:24:22.131133 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.131112 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-86k48" Apr 17 17:24:22.138745 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.138723 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:22.145359 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.145336 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" Apr 17 17:24:22.151913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.151888 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mz8c6" Apr 17 17:24:22.158403 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.158385 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" Apr 17 17:24:22.163902 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.163887 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-z8bdn" Apr 17 17:24:22.194856 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.194830 2565 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:22.196745 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.196725 2565 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 17 17:24:22.432800 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.432762 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:22.432997 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:22.432907 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:22.432997 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:22.432985 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs 
podName:a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa nodeName:}" failed. No retries permitted until 2026-04-17 17:24:23.432967513 +0000 UTC m=+4.091011799 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs") pod "network-metrics-daemon-sx5fl" (UID: "a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:22.513119 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.513051 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5025af7c_9beb_4b9e_9e90_2e2d44ab467a.slice/crio-4acbf590da9841cd43586c8bb7e78007b1a70462b52f60642f9cacdb02db0858 WatchSource:0}: Error finding container 4acbf590da9841cd43586c8bb7e78007b1a70462b52f60642f9cacdb02db0858: Status 404 returned error can't find the container with id 4acbf590da9841cd43586c8bb7e78007b1a70462b52f60642f9cacdb02db0858 Apr 17 17:24:22.514595 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.514570 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72790912_9bff_41c5_8244_6a056c8fc59c.slice/crio-0753b2bcbeaa55726cf6b9b5e5644e7dd9b77384202115e0b88dd54b5c7ea8da WatchSource:0}: Error finding container 0753b2bcbeaa55726cf6b9b5e5644e7dd9b77384202115e0b88dd54b5c7ea8da: Status 404 returned error can't find the container with id 0753b2bcbeaa55726cf6b9b5e5644e7dd9b77384202115e0b88dd54b5c7ea8da Apr 17 17:24:22.515989 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.515964 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3a4d7fdb_21de_4fae_8073_07b155920dfd.slice/crio-809da8abcb4dcf6c35fb1f3cf76955da3bfccae83c9e99dbd2f5e0edb5e0103e WatchSource:0}: Error finding container 
809da8abcb4dcf6c35fb1f3cf76955da3bfccae83c9e99dbd2f5e0edb5e0103e: Status 404 returned error can't find the container with id 809da8abcb4dcf6c35fb1f3cf76955da3bfccae83c9e99dbd2f5e0edb5e0103e Apr 17 17:24:22.517223 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.517133 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7842b7a8_6434_47ce_8ce5_d1c63dd11da1.slice/crio-4a5a1d013945de51fb4f15cf7492648cba1bc9b57d2af5d0febea06a166718a4 WatchSource:0}: Error finding container 4a5a1d013945de51fb4f15cf7492648cba1bc9b57d2af5d0febea06a166718a4: Status 404 returned error can't find the container with id 4a5a1d013945de51fb4f15cf7492648cba1bc9b57d2af5d0febea06a166718a4 Apr 17 17:24:22.519259 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.519033 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fe5d881_6f86_4878_aefd_8707bc6216fb.slice/crio-9dd638467e7988e48bbd2f92f681ca3c87c635d4b3ceac345e30ab63c9c0ed74 WatchSource:0}: Error finding container 9dd638467e7988e48bbd2f92f681ca3c87c635d4b3ceac345e30ab63c9c0ed74: Status 404 returned error can't find the container with id 9dd638467e7988e48bbd2f92f681ca3c87c635d4b3ceac345e30ab63c9c0ed74 Apr 17 17:24:22.520072 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.520053 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod949deda6_6022_4b2b_898e_a6c55f650ba8.slice/crio-9f59079d21b97a379f86759914aae2ffcc50b6fde328ab90b6b94e9a3de1738f WatchSource:0}: Error finding container 9f59079d21b97a379f86759914aae2ffcc50b6fde328ab90b6b94e9a3de1738f: Status 404 returned error can't find the container with id 9f59079d21b97a379f86759914aae2ffcc50b6fde328ab90b6b94e9a3de1738f Apr 17 17:24:22.520748 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.520709 2565 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5b8bc3fe_beaf_4464_9be8_84388596e77a.slice/crio-f7403ee2d942b5e352f4cc5b2645b57e7445147190caab6d2ab735bc2caea36c WatchSource:0}: Error finding container f7403ee2d942b5e352f4cc5b2645b57e7445147190caab6d2ab735bc2caea36c: Status 404 returned error can't find the container with id f7403ee2d942b5e352f4cc5b2645b57e7445147190caab6d2ab735bc2caea36c Apr 17 17:24:22.522640 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.522619 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod509a5284_811e_4026_9b46_fb36dcfe95d4.slice/crio-d98a4651651b9fbff7f540a4e5ac6aefd721220075b03419fcd8dec8310335fd WatchSource:0}: Error finding container d98a4651651b9fbff7f540a4e5ac6aefd721220075b03419fcd8dec8310335fd: Status 404 returned error can't find the container with id d98a4651651b9fbff7f540a4e5ac6aefd721220075b03419fcd8dec8310335fd Apr 17 17:24:22.524036 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:22.524012 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod490c70db_b612_47c1_a980_96f5b8cc2cbc.slice/crio-31e8fe844bd292e21bd9a75bbee513454e95f19bb0ea569944afe37e4d35a7ad WatchSource:0}: Error finding container 31e8fe844bd292e21bd9a75bbee513454e95f19bb0ea569944afe37e4d35a7ad: Status 404 returned error can't find the container with id 31e8fe844bd292e21bd9a75bbee513454e95f19bb0ea569944afe37e4d35a7ad Apr 17 17:24:22.533324 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.533297 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:22.533463 
ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:22.533438 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:22.533463 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:22.533451 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:22.533463 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:22.533459 2565 projected.go:194] Error preparing data for projected volume kube-api-access-c5dcs for pod openshift-network-diagnostics/network-check-target-5vptw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:22.533569 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:22.533502 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs podName:912cdda1-8b21-4e90-bb49-4d95a21c8619 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:23.533486041 +0000 UTC m=+4.191530327 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-c5dcs" (UniqueName: "kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs") pod "network-check-target-5vptw" (UID: "912cdda1-8b21-4e90-bb49-4d95a21c8619") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:22.856012 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.855914 2565 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-16 17:19:20 +0000 UTC" deadline="2027-09-25 01:04:24.161080176 +0000 UTC" Apr 17 17:24:22.856012 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.855951 2565 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="12607h40m1.305133236s" Apr 17 17:24:22.962993 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.962932 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pzhcg" event={"ID":"490c70db-b612-47c1-a980-96f5b8cc2cbc","Type":"ContainerStarted","Data":"31e8fe844bd292e21bd9a75bbee513454e95f19bb0ea569944afe37e4d35a7ad"} Apr 17 17:24:22.969381 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.969313 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z8bdn" event={"ID":"509a5284-811e-4026-9b46-fb36dcfe95d4","Type":"ContainerStarted","Data":"d98a4651651b9fbff7f540a4e5ac6aefd721220075b03419fcd8dec8310335fd"} Apr 17 17:24:22.971776 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.971750 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"9f59079d21b97a379f86759914aae2ffcc50b6fde328ab90b6b94e9a3de1738f"} Apr 17 17:24:22.974972 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:24:22.974947 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mz8c6" event={"ID":"7842b7a8-6434-47ce-8ce5-d1c63dd11da1","Type":"ContainerStarted","Data":"4a5a1d013945de51fb4f15cf7492648cba1bc9b57d2af5d0febea06a166718a4"} Apr 17 17:24:22.985946 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.985919 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" event={"ID":"3a4d7fdb-21de-4fae-8073-07b155920dfd","Type":"ContainerStarted","Data":"809da8abcb4dcf6c35fb1f3cf76955da3bfccae83c9e99dbd2f5e0edb5e0103e"} Apr 17 17:24:22.990148 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:22.990101 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerStarted","Data":"4acbf590da9841cd43586c8bb7e78007b1a70462b52f60642f9cacdb02db0858"} Apr 17 17:24:23.001036 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:23.001008 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal" event={"ID":"8cf3e3e8476c10b85dc36d1342e54de1","Type":"ContainerStarted","Data":"6b5acf0391bd349bd06c49672bfdbaf6047f454c1c653923fd1ccc2bbf3731ae"} Apr 17 17:24:23.003668 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:23.003630 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-86k48" event={"ID":"5b8bc3fe-beaf-4464-9be8-84388596e77a","Type":"ContainerStarted","Data":"f7403ee2d942b5e352f4cc5b2645b57e7445147190caab6d2ab735bc2caea36c"} Apr 17 17:24:23.006762 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:23.006735 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wq4vk" event={"ID":"7fe5d881-6f86-4878-aefd-8707bc6216fb","Type":"ContainerStarted","Data":"9dd638467e7988e48bbd2f92f681ca3c87c635d4b3ceac345e30ab63c9c0ed74"} Apr 17 17:24:23.017912 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:23.017882 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" event={"ID":"72790912-9bff-41c5-8244-6a056c8fc59c","Type":"ContainerStarted","Data":"0753b2bcbeaa55726cf6b9b5e5644e7dd9b77384202115e0b88dd54b5c7ea8da"} Apr 17 17:24:23.441379 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:23.440788 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:23.441379 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:23.440945 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:23.441379 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:23.441013 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs podName:a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa nodeName:}" failed. No retries permitted until 2026-04-17 17:24:25.440992141 +0000 UTC m=+6.099036434 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs") pod "network-metrics-daemon-sx5fl" (UID: "a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:23.541992 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:23.541410 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:23.541992 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:23.541578 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:23.541992 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:23.541597 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:23.541992 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:23.541609 2565 projected.go:194] Error preparing data for projected volume kube-api-access-c5dcs for pod openshift-network-diagnostics/network-check-target-5vptw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:23.541992 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:23.541666 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs podName:912cdda1-8b21-4e90-bb49-4d95a21c8619 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:25.541647773 +0000 UTC m=+6.199692066 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-c5dcs" (UniqueName: "kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs") pod "network-check-target-5vptw" (UID: "912cdda1-8b21-4e90-bb49-4d95a21c8619") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:23.948236 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:23.948195 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:23.948684 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:23.948347 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:23.948684 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:23.948473 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:23.948684 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:23.948594 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:24.029627 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:24.029589 2565 generic.go:358] "Generic (PLEG): container finished" podID="51db773c13b85f7ab11509c484c019d3" containerID="717483dd9d3a5ee70170cf292847d9e1f355e980e5c6d6d2a4f688f71bef3c3d" exitCode=0
Apr 17 17:24:24.030559 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:24.030518 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" event={"ID":"51db773c13b85f7ab11509c484c019d3","Type":"ContainerDied","Data":"717483dd9d3a5ee70170cf292847d9e1f355e980e5c6d6d2a4f688f71bef3c3d"}
Apr 17 17:24:24.057354 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:24.057289 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-87.ec2.internal" podStartSLOduration=3.057265935 podStartE2EDuration="3.057265935s" podCreationTimestamp="2026-04-17 17:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:23.015653016 +0000 UTC m=+3.673697325" watchObservedRunningTime="2026-04-17 17:24:24.057265935 +0000 UTC m=+4.715310241"
Apr 17 17:24:25.042262 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:25.042214 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" event={"ID":"51db773c13b85f7ab11509c484c019d3","Type":"ContainerStarted","Data":"10e5ce395c8135ad18ed0c12f8bd45a1fa6b69cc50cfea3d0cfb9cad4f2e9290"}
Apr 17 17:24:25.067012 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:25.066897 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-87.ec2.internal" podStartSLOduration=4.066876349
podStartE2EDuration="4.066876349s" podCreationTimestamp="2026-04-17 17:24:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:24:25.06680911 +0000 UTC m=+5.724853419" watchObservedRunningTime="2026-04-17 17:24:25.066876349 +0000 UTC m=+5.724920658"
Apr 17 17:24:25.458331 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:25.458289 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:25.458570 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:25.458544 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:25.458665 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:25.458625 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs podName:a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa nodeName:}" failed. No retries permitted until 2026-04-17 17:24:29.458604241 +0000 UTC m=+10.116648542 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs") pod "network-metrics-daemon-sx5fl" (UID: "a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:25.559508 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:25.559444 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:25.559681 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:25.559644 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:25.559681 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:25.559675 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:25.559806 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:25.559689 2565 projected.go:194] Error preparing data for projected volume kube-api-access-c5dcs for pod openshift-network-diagnostics/network-check-target-5vptw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:25.559806 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:25.559756 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs podName:912cdda1-8b21-4e90-bb49-4d95a21c8619 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:29.559735313 +0000 UTC m=+10.217779616 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-c5dcs" (UniqueName: "kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs") pod "network-check-target-5vptw" (UID: "912cdda1-8b21-4e90-bb49-4d95a21c8619") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:25.948511 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:25.948044 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:25.948511 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:25.948218 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:25.948511 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:25.948381 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:25.948511 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:25.948465 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:27.947288 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:27.947254 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:27.947722 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:27.947383 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:27.947722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:27.947439 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:27.947722 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:27.947558 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:29.492430 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:29.492386 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:29.492871 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:29.492568 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:29.492871 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:29.492633 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs podName:a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa nodeName:}" failed. No retries permitted until 2026-04-17 17:24:37.492613118 +0000 UTC m=+18.150657405 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs") pod "network-metrics-daemon-sx5fl" (UID: "a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:29.593336 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:29.593296 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:29.593496 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:29.593462 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:29.593496 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:29.593481 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:29.593496 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:29.593494 2565 projected.go:194] Error preparing data for projected volume kube-api-access-c5dcs for pod openshift-network-diagnostics/network-check-target-5vptw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:29.593676 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:29.593550 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs podName:912cdda1-8b21-4e90-bb49-4d95a21c8619 nodeName:}" failed.
No retries permitted until 2026-04-17 17:24:37.593532454 +0000 UTC m=+18.251576764 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-c5dcs" (UniqueName: "kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs") pod "network-check-target-5vptw" (UID: "912cdda1-8b21-4e90-bb49-4d95a21c8619") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:29.948547 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:29.948036 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:29.948547 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:29.948147 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:29.948547 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:29.948533 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:29.948855 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:29.948681 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:31.947181 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:31.947143 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:31.947645 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:31.947157 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:31.947645 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:31.947344 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:31.947645 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:31.947486 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:33.947685 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:33.947583 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:33.948118 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:33.947718 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:33.948118 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:33.947583 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:33.948235 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:33.948183 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:35.947829 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:35.947786 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:35.948294 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:35.947917 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:35.948294 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:35.947968 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:35.948294 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:35.948083 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:37.543960 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:37.543923 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:37.544416 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:37.544039 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:37.544416 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:37.544103 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs podName:a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa nodeName:}" failed. No retries permitted until 2026-04-17 17:24:53.544087588 +0000 UTC m=+34.202131877 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs") pod "network-metrics-daemon-sx5fl" (UID: "a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 17 17:24:37.645286 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:37.645230 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:37.645453 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:37.645380 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 17 17:24:37.645453 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:37.645404 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 17 17:24:37.645453 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:37.645418 2565 projected.go:194] Error preparing data for projected volume kube-api-access-c5dcs for pod openshift-network-diagnostics/network-check-target-5vptw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:37.645630 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:37.645485 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs podName:912cdda1-8b21-4e90-bb49-4d95a21c8619 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:53.645468597 +0000 UTC m=+34.303512896 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-c5dcs" (UniqueName: "kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs") pod "network-check-target-5vptw" (UID: "912cdda1-8b21-4e90-bb49-4d95a21c8619") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 17 17:24:37.947688 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:37.947651 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:37.947871 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:37.947666 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:37.947871 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:37.947783 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:37.948050 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:37.947885 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:39.948454 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:39.948425 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:24:39.948809 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:39.948552 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619"
Apr 17 17:24:39.949079 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:39.949007 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:24:39.949158 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:39.949128 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa"
Apr 17 17:24:40.066440 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.066278 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" event={"ID":"72790912-9bff-41c5-8244-6a056c8fc59c","Type":"ContainerStarted","Data":"551540cca28a8a75bf87c7062fbf1674635498cb8c5561958108c8b28c808b43"}
Apr 17 17:24:40.067846 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.067765 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-pzhcg" event={"ID":"490c70db-b612-47c1-a980-96f5b8cc2cbc","Type":"ContainerStarted","Data":"7dd1f78c1823b7accc9f43fc0f4458504025c5067b686c381861338eb017b668"}
Apr 17 17:24:40.069513 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.069491 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"687e1fedc86c023c4a022037fb03fc2157481561778f71804da5768eece58c2b"}
Apr 17 17:24:40.073523 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.073492 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mz8c6" event={"ID":"7842b7a8-6434-47ce-8ce5-d1c63dd11da1","Type":"ContainerStarted","Data":"218f5ba0183137f67252804c18edeff0ffdce80c09fa2ebd2e036139d359228f"}
Apr 17 17:24:40.075028 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.075004 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" event={"ID":"3a4d7fdb-21de-4fae-8073-07b155920dfd","Type":"ContainerStarted","Data":"e4098a7be1b997ca17a8d4b8cc36efca6c1c7faecd8c2da2ab2855a66acf0a39"}
Apr 17 17:24:40.076553 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.076510 2565 generic.go:358] "Generic (PLEG): container finished" podID="5025af7c-9beb-4b9e-9e90-2e2d44ab467a" containerID="c525bd7891c3ac7b18be8925f0233fa56eebe6ffc0d345caf398d1cc1b8fb962" exitCode=0
Apr 17 17:24:40.076668 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.076649 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerDied","Data":"c525bd7891c3ac7b18be8925f0233fa56eebe6ffc0d345caf398d1cc1b8fb962"}
Apr 17 17:24:40.078112 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.078087 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-86k48" event={"ID":"5b8bc3fe-beaf-4464-9be8-84388596e77a","Type":"ContainerStarted","Data":"757315e0c2119441cef04bc0084ee1982e846157821e746dbb5ee18a6102ee59"}
Apr 17 17:24:40.079602 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.079579 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wq4vk" event={"ID":"7fe5d881-6f86-4878-aefd-8707bc6216fb","Type":"ContainerStarted","Data":"36d90ab397de0227e210dbf801c3cf6d1a055b970c310b392e06f0df7d5787c4"}
Apr 17 17:24:40.083791 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.083747 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-j2tw4" podStartSLOduration=3.018322687 podStartE2EDuration="20.083731335s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.516639854 +0000 UTC m=+3.174684154" lastFinishedPulling="2026-04-17 17:24:39.582048513 +0000 UTC m=+20.240092802" observedRunningTime="2026-04-17 17:24:40.083106272 +0000 UTC m=+20.741150603" watchObservedRunningTime="2026-04-17 17:24:40.083731335 +0000 UTC m=+20.741775644"
Apr 17 17:24:40.118318 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.118127 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-pzhcg" podStartSLOduration=4.090661287 podStartE2EDuration="21.118108892s"
podCreationTimestamp="2026-04-17 17:24:19 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.526575918 +0000 UTC m=+3.184620204" lastFinishedPulling="2026-04-17 17:24:39.55402352 +0000 UTC m=+20.212067809" observedRunningTime="2026-04-17 17:24:40.117710265 +0000 UTC m=+20.775754573" watchObservedRunningTime="2026-04-17 17:24:40.118108892 +0000 UTC m=+20.776153202"
Apr 17 17:24:40.137930 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.137886 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-86k48" podStartSLOduration=3.046282742 podStartE2EDuration="20.13787136s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.523733401 +0000 UTC m=+3.181777694" lastFinishedPulling="2026-04-17 17:24:39.615322022 +0000 UTC m=+20.273366312" observedRunningTime="2026-04-17 17:24:40.137756979 +0000 UTC m=+20.795801286" watchObservedRunningTime="2026-04-17 17:24:40.13787136 +0000 UTC m=+20.795915667"
Apr 17 17:24:40.150804 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.150767 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mz8c6" podStartSLOduration=7.792847323 podStartE2EDuration="20.15075402s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.519413247 +0000 UTC m=+3.177457542" lastFinishedPulling="2026-04-17 17:24:34.877319936 +0000 UTC m=+15.535364239" observedRunningTime="2026-04-17 17:24:40.150578041 +0000 UTC m=+20.808622348" watchObservedRunningTime="2026-04-17 17:24:40.15075402 +0000 UTC m=+20.808798328"
Apr 17 17:24:40.164930 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.164881 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wq4vk" podStartSLOduration=4.104828063 podStartE2EDuration="21.164862114s" podCreationTimestamp="2026-04-17 17:24:19 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.521975875 +0000 UTC m=+3.180020160" lastFinishedPulling="2026-04-17 17:24:39.582009924 +0000 UTC m=+20.240054211" observedRunningTime="2026-04-17 17:24:40.16469641 +0000 UTC m=+20.822740731" watchObservedRunningTime="2026-04-17 17:24:40.164862114 +0000 UTC m=+20.822906424"
Apr 17 17:24:40.754913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.754885 2565 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 17 17:24:40.891776 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.891650 2565 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-17T17:24:40.754909436Z","UUID":"1fed0dd3-edf5-4737-ba1b-1ef6e60f5255","Handler":null,"Name":"","Endpoint":""}
Apr 17 17:24:40.894308 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.894282 2565 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 17 17:24:40.894446 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:40.894321 2565 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 17 17:24:41.083160 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.083126 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-z8bdn" event={"ID":"509a5284-811e-4026-9b46-fb36dcfe95d4","Type":"ContainerStarted","Data":"469ef28987887c924b31e355120b108e21455f9098520c24b42b6f64a8176835"}
Apr 17 17:24:41.086055 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.086020 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"d3f28e9a2416939199fd516d0bd4d4fdb08125f1664aaa94950ed2f63808b1c8"}
Apr 17 17:24:41.086177 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.086060 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"49c053238d42ebde3e9ce96f2a50793bd85978713789776ec45276eceb260b3e"}
Apr 17 17:24:41.086177 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.086075 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"d8fec4d34cc51a8745a92da766268d72ca7ed1c95462f77942e29a2cb31fc12a"}
Apr 17 17:24:41.086177 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.086085 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"c6eb103b397b738ac6b646257c1ca19344c77a14036fc8cf5613c454e7e903b8"}
Apr 17 17:24:41.086177 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.086093 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"5242afcb620d22f369fe85a32c4cbf906499fa75abfd0c2483944be223ed6690"}
Apr 17 17:24:41.087752 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.087715 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" event={"ID":"3a4d7fdb-21de-4fae-8073-07b155920dfd","Type":"ContainerStarted","Data":"f611555b38622c62e5f422587a06d674cb71c90a65c84563cce96ef32e245415"}
Apr 17 17:24:41.725399 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.725365 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup"
status="unhealthy" pod="kube-system/konnectivity-agent-pzhcg" Apr 17 17:24:41.726135 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.726107 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-pzhcg" Apr 17 17:24:41.950532 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.950451 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:41.950711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:41.950451 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:41.950711 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:41.950576 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa" Apr 17 17:24:41.950711 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:41.950624 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619" Apr 17 17:24:42.091453 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:42.091416 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" event={"ID":"3a4d7fdb-21de-4fae-8073-07b155920dfd","Type":"ContainerStarted","Data":"288bc09e508e8f37d47e4cc50e280792984711950f1efe08ce65615542891ccf"} Apr 17 17:24:42.091903 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:42.091844 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-pzhcg" Apr 17 17:24:42.092120 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:42.092103 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-pzhcg" Apr 17 17:24:42.107474 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:42.107426 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-z8bdn" podStartSLOduration=5.050143168 podStartE2EDuration="22.107410483s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.524743958 +0000 UTC m=+3.182788244" lastFinishedPulling="2026-04-17 17:24:39.582011259 +0000 UTC m=+20.240055559" observedRunningTime="2026-04-17 17:24:42.106966393 +0000 UTC m=+22.765010704" watchObservedRunningTime="2026-04-17 17:24:42.107410483 +0000 UTC m=+22.765454793" Apr 17 17:24:42.137478 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:42.137424 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-s7zwh" podStartSLOduration=2.994556707 podStartE2EDuration="22.137410713s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.51788634 +0000 UTC m=+3.175930625" lastFinishedPulling="2026-04-17 17:24:41.660740336 +0000 UTC 
m=+22.318784631" observedRunningTime="2026-04-17 17:24:42.136727736 +0000 UTC m=+22.794772044" watchObservedRunningTime="2026-04-17 17:24:42.137410713 +0000 UTC m=+22.795455056" Apr 17 17:24:43.097266 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:43.097009 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"f8dbcddf9f051817983e212c2a9caba91d8cd25f82f21d68323fac08f1b4052d"} Apr 17 17:24:43.947940 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:43.947906 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:43.948107 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:43.947906 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:43.948107 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:43.948010 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619" Apr 17 17:24:43.948183 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:43.948108 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa" Apr 17 17:24:45.104410 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:45.104336 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" event={"ID":"949deda6-6022-4b2b-898e-a6c55f650ba8","Type":"ContainerStarted","Data":"fba41be8653b2f02736c1b17415cf2b1d21d043575bc302e376aee20dc3b2391"} Apr 17 17:24:45.105263 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:45.104673 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:45.107075 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:45.106398 2565 generic.go:358] "Generic (PLEG): container finished" podID="5025af7c-9beb-4b9e-9e90-2e2d44ab467a" containerID="14b48e8fe8bb39d522317baf87b2cb6dea7253989ae01cedd3503cfbe6ee889a" exitCode=0 Apr 17 17:24:45.107075 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:45.106469 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerDied","Data":"14b48e8fe8bb39d522317baf87b2cb6dea7253989ae01cedd3503cfbe6ee889a"} Apr 17 17:24:45.120575 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:45.120552 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:45.129905 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:45.129867 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" podStartSLOduration=7.712747332 podStartE2EDuration="25.129855196s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.522006731 +0000 UTC m=+3.180051032" lastFinishedPulling="2026-04-17 17:24:39.939114609 +0000 UTC m=+20.597158896" observedRunningTime="2026-04-17 17:24:45.12949271 
+0000 UTC m=+25.787537017" watchObservedRunningTime="2026-04-17 17:24:45.129855196 +0000 UTC m=+25.787899507" Apr 17 17:24:45.949704 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:45.949642 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:45.949824 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:45.949644 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:45.949824 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:45.949754 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa" Apr 17 17:24:45.949902 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:45.949837 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619" Apr 17 17:24:46.110616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.110584 2565 generic.go:358] "Generic (PLEG): container finished" podID="5025af7c-9beb-4b9e-9e90-2e2d44ab467a" containerID="4b6e81ec41f3ca3098b07753fc65733985a7d0962cc943280190ba338f3d21fe" exitCode=0 Apr 17 17:24:46.111029 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.110668 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerDied","Data":"4b6e81ec41f3ca3098b07753fc65733985a7d0962cc943280190ba338f3d21fe"} Apr 17 17:24:46.111830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.111265 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:46.111830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.111291 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:46.125955 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.125933 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:24:46.407489 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.407462 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5vptw"] Apr 17 17:24:46.407653 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.407553 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:46.407653 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:46.407630 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619" Apr 17 17:24:46.410561 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.410537 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sx5fl"] Apr 17 17:24:46.410651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:46.410617 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:46.410704 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:46.410689 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa" Apr 17 17:24:47.114335 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:47.114071 2565 generic.go:358] "Generic (PLEG): container finished" podID="5025af7c-9beb-4b9e-9e90-2e2d44ab467a" containerID="1918cdfa847a89bf55722da3df8e1ba79e5ad4a18cfcb32dc3239a8d396ee6c2" exitCode=0 Apr 17 17:24:47.114335 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:47.114159 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerDied","Data":"1918cdfa847a89bf55722da3df8e1ba79e5ad4a18cfcb32dc3239a8d396ee6c2"} Apr 17 17:24:47.947087 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:47.947047 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:47.947087 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:47.947078 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:47.947357 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:47.947174 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619" Apr 17 17:24:47.947357 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:47.947304 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa" Apr 17 17:24:49.948746 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:49.948718 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:49.949464 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:49.948829 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa" Apr 17 17:24:49.949464 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:49.948901 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:49.949464 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:49.948986 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619" Apr 17 17:24:51.947943 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:51.947907 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:51.948409 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:51.947921 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:51.948409 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:51.948047 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-5vptw" podUID="912cdda1-8b21-4e90-bb49-4d95a21c8619" Apr 17 17:24:51.948409 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:51.948151 2565 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-sx5fl" podUID="a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa" Apr 17 17:24:52.641695 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.641670 2565 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-87.ec2.internal" event="NodeReady" Apr 17 17:24:52.641830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.641813 2565 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 17 17:24:52.690922 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.690889 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-q2rfs"] Apr 17 17:24:52.704651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.704569 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-vvr96"] Apr 17 17:24:52.704820 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.704753 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.707488 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.707459 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-cl7ph\"" Apr 17 17:24:52.707615 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.707467 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 17 17:24:52.707615 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.707500 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 17 17:24:52.718185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.718166 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q2rfs"] Apr 17 17:24:52.718185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.718186 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vvr96"] Apr 17 17:24:52.718325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.718300 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:52.720608 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.720593 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-rb846\"" Apr 17 17:24:52.720901 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.720870 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 17 17:24:52.720976 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.720961 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 17 17:24:52.721217 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.721202 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 17 17:24:52.858321 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.858290 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:52.858321 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.858323 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.858512 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.858342 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h5ng8\" (UniqueName: 
\"kubernetes.io/projected/f3128c11-df72-45ff-b3c0-ac28a0f059c1-kube-api-access-h5ng8\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:52.858512 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.858391 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5843e29-8a21-4a19-ab43-f6529fb056ca-config-volume\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.858579 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.858503 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5843e29-8a21-4a19-ab43-f6529fb056ca-tmp-dir\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.858579 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.858545 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tkb58\" (UniqueName: \"kubernetes.io/projected/e5843e29-8a21-4a19-ab43-f6529fb056ca-kube-api-access-tkb58\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.959025 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.958992 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tkb58\" (UniqueName: \"kubernetes.io/projected/e5843e29-8a21-4a19-ab43-f6529fb056ca-kube-api-access-tkb58\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.959076 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.959108 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.959133 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h5ng8\" (UniqueName: \"kubernetes.io/projected/f3128c11-df72-45ff-b3c0-ac28a0f059c1-kube-api-access-h5ng8\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.959163 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5843e29-8a21-4a19-ab43-f6529fb056ca-config-volume\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:52.959216 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.959291 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5843e29-8a21-4a19-ab43-f6529fb056ca-tmp-dir\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 
17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:52.959301 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls podName:e5843e29-8a21-4a19-ab43-f6529fb056ca nodeName:}" failed. No retries permitted until 2026-04-17 17:24:53.459281201 +0000 UTC m=+34.117325499 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls") pod "dns-default-q2rfs" (UID: "e5843e29-8a21-4a19-ab43-f6529fb056ca") : secret "dns-default-metrics-tls" not found Apr 17 17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:52.959222 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:52.959500 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:52.959412 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert podName:f3128c11-df72-45ff-b3c0-ac28a0f059c1 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:53.459369578 +0000 UTC m=+34.117413868 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert") pod "ingress-canary-vvr96" (UID: "f3128c11-df72-45ff-b3c0-ac28a0f059c1") : secret "canary-serving-cert" not found Apr 17 17:24:52.959908 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.959595 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/e5843e29-8a21-4a19-ab43-f6529fb056ca-tmp-dir\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.959908 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.959723 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/e5843e29-8a21-4a19-ab43-f6529fb056ca-config-volume\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:52.970273 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.970218 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h5ng8\" (UniqueName: \"kubernetes.io/projected/f3128c11-df72-45ff-b3c0-ac28a0f059c1-kube-api-access-h5ng8\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:52.970273 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:52.970264 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tkb58\" (UniqueName: \"kubernetes.io/projected/e5843e29-8a21-4a19-ab43-f6529fb056ca-kube-api-access-tkb58\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:53.128097 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.127916 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" 
event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerStarted","Data":"c60ad5cd866d8c9106a7666d6a559f6748ba36543b4d640641224f9076331076"} Apr 17 17:24:53.463548 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.463468 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:53.463548 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.463513 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:53.463767 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.463609 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:53.463767 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.463612 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:53.463767 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.463661 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls podName:e5843e29-8a21-4a19-ab43-f6529fb056ca nodeName:}" failed. No retries permitted until 2026-04-17 17:24:54.46364711 +0000 UTC m=+35.121691396 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls") pod "dns-default-q2rfs" (UID: "e5843e29-8a21-4a19-ab43-f6529fb056ca") : secret "dns-default-metrics-tls" not found Apr 17 17:24:53.463767 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.463675 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert podName:f3128c11-df72-45ff-b3c0-ac28a0f059c1 nodeName:}" failed. No retries permitted until 2026-04-17 17:24:54.463669069 +0000 UTC m=+35.121713355 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert") pod "ingress-canary-vvr96" (UID: "f3128c11-df72-45ff-b3c0-ac28a0f059c1") : secret "canary-serving-cert" not found Apr 17 17:24:53.564450 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.564415 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:53.564606 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.564555 2565 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:53.564649 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.564616 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs podName:a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa nodeName:}" failed. No retries permitted until 2026-04-17 17:25:25.564600565 +0000 UTC m=+66.222644850 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs") pod "network-metrics-daemon-sx5fl" (UID: "a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 17 17:24:53.665156 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.665121 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:53.665314 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.665290 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 17 17:24:53.665314 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.665310 2565 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 17 17:24:53.665385 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.665321 2565 projected.go:194] Error preparing data for projected volume kube-api-access-c5dcs for pod openshift-network-diagnostics/network-check-target-5vptw: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:53.665385 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:53.665371 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs podName:912cdda1-8b21-4e90-bb49-4d95a21c8619 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:25:25.665357363 +0000 UTC m=+66.323401648 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-c5dcs" (UniqueName: "kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs") pod "network-check-target-5vptw" (UID: "912cdda1-8b21-4e90-bb49-4d95a21c8619") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 17 17:24:53.947985 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.947947 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl" Apr 17 17:24:53.947985 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.947980 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:24:53.950716 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.950694 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 17 17:24:53.950845 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.950764 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 17 17:24:53.950845 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.950780 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6nwcp\"" Apr 17 17:24:53.950845 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.950794 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 17 17:24:53.950978 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:53.950911 2565 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m76bp\"" Apr 17 17:24:54.132760 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:54.132725 2565 generic.go:358] "Generic (PLEG): container finished" podID="5025af7c-9beb-4b9e-9e90-2e2d44ab467a" containerID="c60ad5cd866d8c9106a7666d6a559f6748ba36543b4d640641224f9076331076" exitCode=0 Apr 17 17:24:54.133185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:54.132771 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerDied","Data":"c60ad5cd866d8c9106a7666d6a559f6748ba36543b4d640641224f9076331076"} Apr 17 17:24:54.471785 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:54.471702 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:54.471785 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:54.471775 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:54.472231 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:54.471852 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:24:54.472231 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:54.471900 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert podName:f3128c11-df72-45ff-b3c0-ac28a0f059c1 nodeName:}" failed. 
No retries permitted until 2026-04-17 17:24:56.471887894 +0000 UTC m=+37.129932185 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert") pod "ingress-canary-vvr96" (UID: "f3128c11-df72-45ff-b3c0-ac28a0f059c1") : secret "canary-serving-cert" not found Apr 17 17:24:54.472231 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:54.471854 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:54.472231 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:54.471982 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls podName:e5843e29-8a21-4a19-ab43-f6529fb056ca nodeName:}" failed. No retries permitted until 2026-04-17 17:24:56.471966299 +0000 UTC m=+37.130010584 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls") pod "dns-default-q2rfs" (UID: "e5843e29-8a21-4a19-ab43-f6529fb056ca") : secret "dns-default-metrics-tls" not found Apr 17 17:24:55.137292 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:55.137254 2565 generic.go:358] "Generic (PLEG): container finished" podID="5025af7c-9beb-4b9e-9e90-2e2d44ab467a" containerID="b6ff70af151c18aa09b105e63b22aa2e7ef386c81bf1841229abf7302d1ce687" exitCode=0 Apr 17 17:24:55.137796 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:55.137308 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerDied","Data":"b6ff70af151c18aa09b105e63b22aa2e7ef386c81bf1841229abf7302d1ce687"} Apr 17 17:24:56.143748 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.143719 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-additional-cni-plugins-4jwq8" event={"ID":"5025af7c-9beb-4b9e-9e90-2e2d44ab467a","Type":"ContainerStarted","Data":"507abe7ea2b24237d5a92e6d11713dad45321f2c99b052e869edad38aacfb3cf"} Apr 17 17:24:56.173154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.173103 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-4jwq8" podStartSLOduration=5.776023323 podStartE2EDuration="36.173089845s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:24:22.514710822 +0000 UTC m=+3.172755108" lastFinishedPulling="2026-04-17 17:24:52.911777342 +0000 UTC m=+33.569821630" observedRunningTime="2026-04-17 17:24:56.171871667 +0000 UTC m=+36.829915975" watchObservedRunningTime="2026-04-17 17:24:56.173089845 +0000 UTC m=+36.831134153" Apr 17 17:24:56.486136 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.486102 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:24:56.486136 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.486137 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:24:56.486357 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:56.486226 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:24:56.486357 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:56.486234 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: 
secret "canary-serving-cert" not found Apr 17 17:24:56.486357 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:56.486299 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls podName:e5843e29-8a21-4a19-ab43-f6529fb056ca nodeName:}" failed. No retries permitted until 2026-04-17 17:25:00.486281146 +0000 UTC m=+41.144325434 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls") pod "dns-default-q2rfs" (UID: "e5843e29-8a21-4a19-ab43-f6529fb056ca") : secret "dns-default-metrics-tls" not found Apr 17 17:24:56.486357 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:24:56.486318 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert podName:f3128c11-df72-45ff-b3c0-ac28a0f059c1 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:00.486311126 +0000 UTC m=+41.144355411 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert") pod "ingress-canary-vvr96" (UID: "f3128c11-df72-45ff-b3c0-ac28a0f059c1") : secret "canary-serving-cert" not found Apr 17 17:24:56.579482 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.579450 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242"] Apr 17 17:24:56.602523 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.602492 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242"] Apr 17 17:24:56.602659 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.602551 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" Apr 17 17:24:56.605418 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.605397 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-d5gw2\"" Apr 17 17:24:56.605832 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.605818 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\"" Apr 17 17:24:56.605935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.605916 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\"" Apr 17 17:24:56.687517 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.687483 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qqqtq\" (UniqueName: \"kubernetes.io/projected/b7946c65-567b-4c5b-9c52-c4a315fd0c39-kube-api-access-qqqtq\") pod \"migrator-74bb7799d9-hm242\" (UID: \"b7946c65-567b-4c5b-9c52-c4a315fd0c39\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" Apr 17 17:24:56.788833 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.788731 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qqqtq\" (UniqueName: \"kubernetes.io/projected/b7946c65-567b-4c5b-9c52-c4a315fd0c39-kube-api-access-qqqtq\") pod \"migrator-74bb7799d9-hm242\" (UID: \"b7946c65-567b-4c5b-9c52-c4a315fd0c39\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" Apr 17 17:24:56.803416 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.803390 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qqqtq\" (UniqueName: 
\"kubernetes.io/projected/b7946c65-567b-4c5b-9c52-c4a315fd0c39-kube-api-access-qqqtq\") pod \"migrator-74bb7799d9-hm242\" (UID: \"b7946c65-567b-4c5b-9c52-c4a315fd0c39\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" Apr 17 17:24:56.911684 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:56.911646 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" Apr 17 17:24:57.075074 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.074816 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242"] Apr 17 17:24:57.079219 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:57.079193 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb7946c65_567b_4c5b_9c52_c4a315fd0c39.slice/crio-97762b4f60fcbf29915a878c43ca81b18ac819c2f5784e4d2ef57240f6c3e372 WatchSource:0}: Error finding container 97762b4f60fcbf29915a878c43ca81b18ac819c2f5784e4d2ef57240f6c3e372: Status 404 returned error can't find the container with id 97762b4f60fcbf29915a878c43ca81b18ac819c2f5784e4d2ef57240f6c3e372 Apr 17 17:24:57.146982 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.146951 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" event={"ID":"b7946c65-567b-4c5b-9c52-c4a315fd0c39","Type":"ContainerStarted","Data":"97762b4f60fcbf29915a878c43ca81b18ac819c2f5784e4d2ef57240f6c3e372"} Apr 17 17:24:57.289197 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.289161 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kgg4d"] Apr 17 17:24:57.303194 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.303170 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kgg4d"] Apr 17 
17:24:57.303374 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.303288 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.305549 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.305525 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\"" Apr 17 17:24:57.305701 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.305675 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\"" Apr 17 17:24:57.305811 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.305727 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\"" Apr 17 17:24:57.305811 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.305745 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\"" Apr 17 17:24:57.306069 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.305853 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-qqr94\"" Apr 17 17:24:57.393949 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.393922 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7d965f6d-b845-48d8-b08e-0d802a123168-signing-key\") pod \"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.394101 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.393967 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7d965f6d-b845-48d8-b08e-0d802a123168-signing-cabundle\") pod 
\"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.394101 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.394020 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l7l45\" (UniqueName: \"kubernetes.io/projected/7d965f6d-b845-48d8-b08e-0d802a123168-kube-api-access-l7l45\") pod \"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.495215 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.495181 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7d965f6d-b845-48d8-b08e-0d802a123168-signing-key\") pod \"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.495401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.495231 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7d965f6d-b845-48d8-b08e-0d802a123168-signing-cabundle\") pod \"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.495401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.495349 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l7l45\" (UniqueName: \"kubernetes.io/projected/7d965f6d-b845-48d8-b08e-0d802a123168-kube-api-access-l7l45\") pod \"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.495985 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.495968 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/7d965f6d-b845-48d8-b08e-0d802a123168-signing-cabundle\") pod \"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.498542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.498523 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/7d965f6d-b845-48d8-b08e-0d802a123168-signing-key\") pod \"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.503971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.503950 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l7l45\" (UniqueName: \"kubernetes.io/projected/7d965f6d-b845-48d8-b08e-0d802a123168-kube-api-access-l7l45\") pod \"service-ca-865cb79987-kgg4d\" (UID: \"7d965f6d-b845-48d8-b08e-0d802a123168\") " pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.612861 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.612829 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-kgg4d" Apr 17 17:24:57.712808 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.712773 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wq4vk_7fe5d881-6f86-4878-aefd-8707bc6216fb/dns-node-resolver/0.log" Apr 17 17:24:57.731589 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:57.731514 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-kgg4d"] Apr 17 17:24:57.735352 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:24:57.735319 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d965f6d_b845_48d8_b08e_0d802a123168.slice/crio-3b23fe135bf7ad00f26001c62ff9a7091f6f5693f92cae39e7adeab6634704c2 WatchSource:0}: Error finding container 3b23fe135bf7ad00f26001c62ff9a7091f6f5693f92cae39e7adeab6634704c2: Status 404 returned error can't find the container with id 3b23fe135bf7ad00f26001c62ff9a7091f6f5693f92cae39e7adeab6634704c2 Apr 17 17:24:58.151547 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:58.151503 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kgg4d" event={"ID":"7d965f6d-b845-48d8-b08e-0d802a123168","Type":"ContainerStarted","Data":"3b23fe135bf7ad00f26001c62ff9a7091f6f5693f92cae39e7adeab6634704c2"} Apr 17 17:24:58.313407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:58.313377 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mz8c6_7842b7a8-6434-47ce-8ce5-d1c63dd11da1/node-ca/0.log" Apr 17 17:24:59.155425 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:59.155386 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" 
event={"ID":"b7946c65-567b-4c5b-9c52-c4a315fd0c39","Type":"ContainerStarted","Data":"ee09048defdf09b731ea430ac6f36d5d4d30635ba642e2a283986e117a1caff8"} Apr 17 17:24:59.155425 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:59.155426 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" event={"ID":"b7946c65-567b-4c5b-9c52-c4a315fd0c39","Type":"ContainerStarted","Data":"a8da09162dbb40129e9eab98b3d090f398da47fce6882f9b2d92e3c0dfd57c06"} Apr 17 17:24:59.173647 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:24:59.173582 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-hm242" podStartSLOduration=1.456264988 podStartE2EDuration="3.173560495s" podCreationTimestamp="2026-04-17 17:24:56 +0000 UTC" firstStartedPulling="2026-04-17 17:24:57.080982847 +0000 UTC m=+37.739027133" lastFinishedPulling="2026-04-17 17:24:58.798278351 +0000 UTC m=+39.456322640" observedRunningTime="2026-04-17 17:24:59.172615212 +0000 UTC m=+39.830659524" watchObservedRunningTime="2026-04-17 17:24:59.173560495 +0000 UTC m=+39.831604804" Apr 17 17:25:00.158565 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:00.158496 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-kgg4d" event={"ID":"7d965f6d-b845-48d8-b08e-0d802a123168","Type":"ContainerStarted","Data":"f5575a8fcc6f85ece44106c1c6604ca8d408e3b56dbdc26cb798dde41b56d892"} Apr 17 17:25:00.175694 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:00.175637 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-kgg4d" podStartSLOduration=0.884939731 podStartE2EDuration="3.175618443s" podCreationTimestamp="2026-04-17 17:24:57 +0000 UTC" firstStartedPulling="2026-04-17 17:24:57.737521486 +0000 UTC m=+38.395565771" lastFinishedPulling="2026-04-17 17:25:00.028200197 +0000 UTC 
m=+40.686244483" observedRunningTime="2026-04-17 17:25:00.174757741 +0000 UTC m=+40.832802062" watchObservedRunningTime="2026-04-17 17:25:00.175618443 +0000 UTC m=+40.833662755" Apr 17 17:25:00.520220 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:00.520180 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:25:00.520220 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:00.520221 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:25:00.520434 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:00.520345 2565 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 17 17:25:00.520434 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:00.520396 2565 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 17 17:25:00.520434 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:00.520401 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert podName:f3128c11-df72-45ff-b3c0-ac28a0f059c1 nodeName:}" failed. No retries permitted until 2026-04-17 17:25:08.520386939 +0000 UTC m=+49.178431225 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert") pod "ingress-canary-vvr96" (UID: "f3128c11-df72-45ff-b3c0-ac28a0f059c1") : secret "canary-serving-cert" not found Apr 17 17:25:00.520434 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:00.520436 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls podName:e5843e29-8a21-4a19-ab43-f6529fb056ca nodeName:}" failed. No retries permitted until 2026-04-17 17:25:08.520424932 +0000 UTC m=+49.178469218 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls") pod "dns-default-q2rfs" (UID: "e5843e29-8a21-4a19-ab43-f6529fb056ca") : secret "dns-default-metrics-tls" not found Apr 17 17:25:08.583791 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:08.583752 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:25:08.583791 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:08.583793 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " pod="openshift-dns/dns-default-q2rfs" Apr 17 17:25:08.586259 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:08.586217 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/e5843e29-8a21-4a19-ab43-f6529fb056ca-metrics-tls\") pod \"dns-default-q2rfs\" (UID: \"e5843e29-8a21-4a19-ab43-f6529fb056ca\") " 
pod="openshift-dns/dns-default-q2rfs" Apr 17 17:25:08.586426 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:08.586410 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/f3128c11-df72-45ff-b3c0-ac28a0f059c1-cert\") pod \"ingress-canary-vvr96\" (UID: \"f3128c11-df72-45ff-b3c0-ac28a0f059c1\") " pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:25:08.615381 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:08.615353 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-q2rfs" Apr 17 17:25:08.626089 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:08.626059 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-vvr96" Apr 17 17:25:08.768956 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:08.768928 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-q2rfs"] Apr 17 17:25:08.772421 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:08.772397 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode5843e29_8a21_4a19_ab43_f6529fb056ca.slice/crio-aff6e86fd6d145bed0599c7add901393dcbaf4f6bfd0a11fcab2aa8c0d7308ef WatchSource:0}: Error finding container aff6e86fd6d145bed0599c7add901393dcbaf4f6bfd0a11fcab2aa8c0d7308ef: Status 404 returned error can't find the container with id aff6e86fd6d145bed0599c7add901393dcbaf4f6bfd0a11fcab2aa8c0d7308ef Apr 17 17:25:08.783295 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:08.783272 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-vvr96"] Apr 17 17:25:08.786468 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:08.786446 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf3128c11_df72_45ff_b3c0_ac28a0f059c1.slice/crio-6257086bb018ea1974c0e8d137b916c904a22cba23fd26ab231e380910de67ed WatchSource:0}: Error finding container 6257086bb018ea1974c0e8d137b916c904a22cba23fd26ab231e380910de67ed: Status 404 returned error can't find the container with id 6257086bb018ea1974c0e8d137b916c904a22cba23fd26ab231e380910de67ed Apr 17 17:25:09.176638 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:09.176598 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vvr96" event={"ID":"f3128c11-df72-45ff-b3c0-ac28a0f059c1","Type":"ContainerStarted","Data":"6257086bb018ea1974c0e8d137b916c904a22cba23fd26ab231e380910de67ed"} Apr 17 17:25:09.177571 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:09.177548 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q2rfs" event={"ID":"e5843e29-8a21-4a19-ab43-f6529fb056ca","Type":"ContainerStarted","Data":"aff6e86fd6d145bed0599c7add901393dcbaf4f6bfd0a11fcab2aa8c0d7308ef"} Apr 17 17:25:11.184207 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:11.184129 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-vvr96" event={"ID":"f3128c11-df72-45ff-b3c0-ac28a0f059c1","Type":"ContainerStarted","Data":"065e9081827196610c556874a8c1d945ef2d68dedf2ff239b7d7c3851c76824f"} Apr 17 17:25:11.186262 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:11.186189 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q2rfs" event={"ID":"e5843e29-8a21-4a19-ab43-f6529fb056ca","Type":"ContainerStarted","Data":"7f495bdf981c52200a12b9f73343a2d3ca0ac39358728878482794fae064982f"} Apr 17 17:25:11.202122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:11.202076 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-vvr96" podStartSLOduration=16.997912697 
podStartE2EDuration="19.202064495s" podCreationTimestamp="2026-04-17 17:24:52 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.788392179 +0000 UTC m=+49.446436469" lastFinishedPulling="2026-04-17 17:25:10.992543981 +0000 UTC m=+51.650588267" observedRunningTime="2026-04-17 17:25:11.200610038 +0000 UTC m=+51.858654348" watchObservedRunningTime="2026-04-17 17:25:11.202064495 +0000 UTC m=+51.860108802" Apr 17 17:25:12.189980 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:12.189938 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-q2rfs" event={"ID":"e5843e29-8a21-4a19-ab43-f6529fb056ca","Type":"ContainerStarted","Data":"6df6b5b8d318bc15ec4b021bc0f56a7d4dcccceabe7e8dd412edbee9f81aae38"} Apr 17 17:25:12.209751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:12.209690 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-q2rfs" podStartSLOduration=17.996944595 podStartE2EDuration="20.209674689s" podCreationTimestamp="2026-04-17 17:24:52 +0000 UTC" firstStartedPulling="2026-04-17 17:25:08.774354989 +0000 UTC m=+49.432399278" lastFinishedPulling="2026-04-17 17:25:10.987085068 +0000 UTC m=+51.645129372" observedRunningTime="2026-04-17 17:25:12.208633894 +0000 UTC m=+52.866678214" watchObservedRunningTime="2026-04-17 17:25:12.209674689 +0000 UTC m=+52.867718996" Apr 17 17:25:13.193482 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:13.193451 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-q2rfs" Apr 17 17:25:18.127868 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:18.127835 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-25fqz" Apr 17 17:25:20.680854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.680818 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh"] Apr 17 
17:25:20.685599 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.685577 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w"] Apr 17 17:25:20.685751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.685734 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.688695 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.688673 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" Apr 17 17:25:20.689019 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.688999 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 17 17:25:20.689131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.689004 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 17 17:25:20.689131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.689093 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 17 17:25:20.691657 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.691639 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 17 17:25:20.692621 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.692603 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-85zbd\"" Apr 17 17:25:20.696937 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.696919 2565 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 17 17:25:20.699969 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.699949 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh"] Apr 17 17:25:20.708617 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.708594 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w"] Apr 17 17:25:20.767475 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.767444 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-tmp\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.767475 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.767478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdcph\" (UniqueName: \"kubernetes.io/projected/f5a7faea-7a56-43f6-bd25-8e96fe627a31-kube-api-access-gdcph\") pod \"managed-serviceaccount-addon-agent-5b4fc94749-gkz5w\" (UID: \"f5a7faea-7a56-43f6-bd25-8e96fe627a31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" Apr 17 17:25:20.767722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.767499 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkkc4\" (UniqueName: \"kubernetes.io/projected/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-kube-api-access-dkkc4\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.767722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.767615 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f5a7faea-7a56-43f6-bd25-8e96fe627a31-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5b4fc94749-gkz5w\" (UID: \"f5a7faea-7a56-43f6-bd25-8e96fe627a31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" Apr 17 17:25:20.767722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.767657 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.860972 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.860938 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-jrddb"] Apr 17 17:25:20.864038 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.864022 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.868731 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.868699 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f8a29699-81b9-4195-bf3c-aec014a689f0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.868866 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.868739 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.868866 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.868794 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8a29699-81b9-4195-bf3c-aec014a689f0-data-volume\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.868866 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.868827 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-tmp\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.868866 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.868852 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnf7s\" (UniqueName: \"kubernetes.io/projected/f8a29699-81b9-4195-bf3c-aec014a689f0-kube-api-access-cnf7s\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.869077 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.868885 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdcph\" (UniqueName: \"kubernetes.io/projected/f5a7faea-7a56-43f6-bd25-8e96fe627a31-kube-api-access-gdcph\") pod \"managed-serviceaccount-addon-agent-5b4fc94749-gkz5w\" (UID: \"f5a7faea-7a56-43f6-bd25-8e96fe627a31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" Apr 17 17:25:20.869077 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.868911 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkkc4\" (UniqueName: \"kubernetes.io/projected/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-kube-api-access-dkkc4\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.869077 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.868979 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f8a29699-81b9-4195-bf3c-aec014a689f0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.869077 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.869032 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/f8a29699-81b9-4195-bf3c-aec014a689f0-crio-socket\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.869315 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.869078 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f5a7faea-7a56-43f6-bd25-8e96fe627a31-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5b4fc94749-gkz5w\" (UID: \"f5a7faea-7a56-43f6-bd25-8e96fe627a31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" Apr 17 17:25:20.869315 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.869192 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-tmp\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.871626 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.871606 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/f5a7faea-7a56-43f6-bd25-8e96fe627a31-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-5b4fc94749-gkz5w\" (UID: \"f5a7faea-7a56-43f6-bd25-8e96fe627a31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" Apr 17 17:25:20.871713 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.871616 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-klusterlet-config\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.879389 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.879365 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 17 17:25:20.879506 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.879441 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 17 17:25:20.879506 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.879491 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-42cdb\"" Apr 17 17:25:20.879506 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.879500 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 17 17:25:20.879648 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.879494 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 17 17:25:20.898074 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.898052 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jrddb"] Apr 17 17:25:20.904920 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.904893 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkkc4\" (UniqueName: \"kubernetes.io/projected/ef67b48f-c190-4025-9e62-fb31e9c1b8fa-kube-api-access-dkkc4\") pod \"klusterlet-addon-workmgr-7bfb98874b-9v6mh\" (UID: \"ef67b48f-c190-4025-9e62-fb31e9c1b8fa\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:20.906025 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.906006 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-gdcph\" (UniqueName: \"kubernetes.io/projected/f5a7faea-7a56-43f6-bd25-8e96fe627a31-kube-api-access-gdcph\") pod \"managed-serviceaccount-addon-agent-5b4fc94749-gkz5w\" (UID: \"f5a7faea-7a56-43f6-bd25-8e96fe627a31\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" Apr 17 17:25:20.931135 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.931063 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-5b896476cd-bjzsc"] Apr 17 17:25:20.934190 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.934170 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.941193 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.941176 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 17 17:25:20.941298 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.941194 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 17 17:25:20.942401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.942384 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-vv5sf\"" Apr 17 17:25:20.942581 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.942565 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 17 17:25:20.951057 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.951037 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 17 17:25:20.965679 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.965656 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-image-registry/image-registry-5b896476cd-bjzsc"] Apr 17 17:25:20.969809 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.969786 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f8a29699-81b9-4195-bf3c-aec014a689f0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.969909 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.969819 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eba52262-b7bd-445f-9a9f-a8688fc5e324-trusted-ca\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.969909 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.969838 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eba52262-b7bd-445f-9a9f-a8688fc5e324-installation-pull-secrets\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.969909 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.969855 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-bound-sa-token\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.970021 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.969909 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f8a29699-81b9-4195-bf3c-aec014a689f0-crio-socket\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.970055 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970035 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eba52262-b7bd-445f-9a9f-a8688fc5e324-ca-trust-extracted\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.970095 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970077 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eba52262-b7bd-445f-9a9f-a8688fc5e324-registry-certificates\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.970133 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970110 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpv7l\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-kube-api-access-dpv7l\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.970169 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970141 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f8a29699-81b9-4195-bf3c-aec014a689f0-crio-socket\") pod \"insights-runtime-extractor-jrddb\" (UID: 
\"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.970169 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970143 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f8a29699-81b9-4195-bf3c-aec014a689f0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.970257 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970215 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8a29699-81b9-4195-bf3c-aec014a689f0-data-volume\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.970301 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970239 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eba52262-b7bd-445f-9a9f-a8688fc5e324-image-registry-private-configuration\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.970301 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970284 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-registry-tls\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:20.970375 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970305 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-cnf7s\" (UniqueName: \"kubernetes.io/projected/f8a29699-81b9-4195-bf3c-aec014a689f0-kube-api-access-cnf7s\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.970512 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970493 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f8a29699-81b9-4195-bf3c-aec014a689f0-data-volume\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.970701 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.970686 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f8a29699-81b9-4195-bf3c-aec014a689f0-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.972131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.972104 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f8a29699-81b9-4195-bf3c-aec014a689f0-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.996491 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.996467 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnf7s\" (UniqueName: \"kubernetes.io/projected/f8a29699-81b9-4195-bf3c-aec014a689f0-kube-api-access-cnf7s\") pod \"insights-runtime-extractor-jrddb\" (UID: \"f8a29699-81b9-4195-bf3c-aec014a689f0\") " 
pod="openshift-insights/insights-runtime-extractor-jrddb" Apr 17 17:25:20.996608 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:20.996566 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" Apr 17 17:25:21.024075 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.024044 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" Apr 17 17:25:21.071175 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.071147 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eba52262-b7bd-445f-9a9f-a8688fc5e324-registry-certificates\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:21.071352 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.071180 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpv7l\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-kube-api-access-dpv7l\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:21.071352 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.071226 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eba52262-b7bd-445f-9a9f-a8688fc5e324-image-registry-private-configuration\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:21.071352 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.071258 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-registry-tls\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.071352 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.071299 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eba52262-b7bd-445f-9a9f-a8688fc5e324-trusted-ca\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.071352 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.071323 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eba52262-b7bd-445f-9a9f-a8688fc5e324-installation-pull-secrets\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.071352 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.071345 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-bound-sa-token\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.071641 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.071381 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eba52262-b7bd-445f-9a9f-a8688fc5e324-ca-trust-extracted\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.072066 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.072013 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/eba52262-b7bd-445f-9a9f-a8688fc5e324-ca-trust-extracted\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.072687 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.072662 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/eba52262-b7bd-445f-9a9f-a8688fc5e324-trusted-ca\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.072832 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.072800 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/eba52262-b7bd-445f-9a9f-a8688fc5e324-registry-certificates\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.074070 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.074047 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/eba52262-b7bd-445f-9a9f-a8688fc5e324-image-registry-private-configuration\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.074459 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.074369 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/eba52262-b7bd-445f-9a9f-a8688fc5e324-installation-pull-secrets\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.074772 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.074755 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-registry-tls\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.093028 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.092999 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpv7l\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-kube-api-access-dpv7l\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.093177 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.093060 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/eba52262-b7bd-445f-9a9f-a8688fc5e324-bound-sa-token\") pod \"image-registry-5b896476cd-bjzsc\" (UID: \"eba52262-b7bd-445f-9a9f-a8688fc5e324\") " pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.152729 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.152699 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh"]
Apr 17 17:25:21.156096 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:21.156067 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podef67b48f_c190_4025_9e62_fb31e9c1b8fa.slice/crio-645751dec0e5847f2ae4cdc0390f52ad612793864569e7ecb755f0fdd58be410 WatchSource:0}: Error finding container 645751dec0e5847f2ae4cdc0390f52ad612793864569e7ecb755f0fdd58be410: Status 404 returned error can't find the container with id 645751dec0e5847f2ae4cdc0390f52ad612793864569e7ecb755f0fdd58be410
Apr 17 17:25:21.170033 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.170009 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w"]
Apr 17 17:25:21.172141 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.172124 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-jrddb"
Apr 17 17:25:21.173814 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:21.173795 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podf5a7faea_7a56_43f6_bd25_8e96fe627a31.slice/crio-d34be73e7de0251bd12937f9c57e041741b8846b70407d480a2efdf24cf9acb2 WatchSource:0}: Error finding container d34be73e7de0251bd12937f9c57e041741b8846b70407d480a2efdf24cf9acb2: Status 404 returned error can't find the container with id d34be73e7de0251bd12937f9c57e041741b8846b70407d480a2efdf24cf9acb2
Apr 17 17:25:21.213368 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.213278 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" event={"ID":"f5a7faea-7a56-43f6-bd25-8e96fe627a31","Type":"ContainerStarted","Data":"d34be73e7de0251bd12937f9c57e041741b8846b70407d480a2efdf24cf9acb2"}
Apr 17 17:25:21.214173 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.214148 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" event={"ID":"ef67b48f-c190-4025-9e62-fb31e9c1b8fa","Type":"ContainerStarted","Data":"645751dec0e5847f2ae4cdc0390f52ad612793864569e7ecb755f0fdd58be410"}
Apr 17 17:25:21.242555 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.242526 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:21.295431 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.295400 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-jrddb"]
Apr 17 17:25:21.299844 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:21.299816 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf8a29699_81b9_4195_bf3c_aec014a689f0.slice/crio-ba808679cd4df33cf66d7966c1355d35f32ea310a3499584ff60e29d5e0a33fc WatchSource:0}: Error finding container ba808679cd4df33cf66d7966c1355d35f32ea310a3499584ff60e29d5e0a33fc: Status 404 returned error can't find the container with id ba808679cd4df33cf66d7966c1355d35f32ea310a3499584ff60e29d5e0a33fc
Apr 17 17:25:21.371619 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:21.371584 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-5b896476cd-bjzsc"]
Apr 17 17:25:21.376899 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:21.376872 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeba52262_b7bd_445f_9a9f_a8688fc5e324.slice/crio-d36e32b2b1251dcc512ff661d08f27a3db3a7f8a9c65a368479d051b8a00ffb9 WatchSource:0}: Error finding container d36e32b2b1251dcc512ff661d08f27a3db3a7f8a9c65a368479d051b8a00ffb9: Status 404 returned error can't find the container with id d36e32b2b1251dcc512ff661d08f27a3db3a7f8a9c65a368479d051b8a00ffb9
Apr 17 17:25:22.219971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:22.219907 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jrddb" event={"ID":"f8a29699-81b9-4195-bf3c-aec014a689f0","Type":"ContainerStarted","Data":"6f72eb4d27b9fdbefaabe144a56e82f49f9d8134d12dd786687412f9ab417641"}
Apr 17 17:25:22.219971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:22.219948 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jrddb" event={"ID":"f8a29699-81b9-4195-bf3c-aec014a689f0","Type":"ContainerStarted","Data":"ba808679cd4df33cf66d7966c1355d35f32ea310a3499584ff60e29d5e0a33fc"}
Apr 17 17:25:22.221817 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:22.221741 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" event={"ID":"eba52262-b7bd-445f-9a9f-a8688fc5e324","Type":"ContainerStarted","Data":"2d1a6a52f69dcb4b8391a2f1d2203c662f8b6c9a04d154761a37915eeb44dfe0"}
Apr 17 17:25:22.221817 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:22.221773 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" event={"ID":"eba52262-b7bd-445f-9a9f-a8688fc5e324","Type":"ContainerStarted","Data":"d36e32b2b1251dcc512ff661d08f27a3db3a7f8a9c65a368479d051b8a00ffb9"}
Apr 17 17:25:22.222073 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:22.222052 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-5b896476cd-bjzsc"
Apr 17 17:25:22.244979 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:22.243855 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" podStartSLOduration=2.243838287 podStartE2EDuration="2.243838287s" podCreationTimestamp="2026-04-17 17:25:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:25:22.243662267 +0000 UTC m=+62.901706576" watchObservedRunningTime="2026-04-17 17:25:22.243838287 +0000 UTC m=+62.901882597"
Apr 17 17:25:23.198867 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:23.198642 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-q2rfs"
Apr 17 17:25:23.237326 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:23.232164 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jrddb" event={"ID":"f8a29699-81b9-4195-bf3c-aec014a689f0","Type":"ContainerStarted","Data":"1cc2e14c11ae2af27203d0ecb1afd2b7c09b02c3831586efe55f47471341a154"}
Apr 17 17:25:24.575677 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.575640 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"]
Apr 17 17:25:24.600686 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.600654 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"]
Apr 17 17:25:24.600864 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.600789 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.603456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.603431 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\""
Apr 17 17:25:24.603594 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.603430 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\""
Apr 17 17:25:24.604751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.604299 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 17 17:25:24.604751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.604348 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\""
Apr 17 17:25:24.604751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.604726 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 17 17:25:24.608223 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.605320 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-tzlkp\""
Apr 17 17:25:24.705474 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.705421 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57e3e3e5-be2c-461c-95d8-687cb6527c2f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.705474 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.705479 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.705755 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.705513 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.705755 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.705542 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d9xkl\" (UniqueName: \"kubernetes.io/projected/57e3e3e5-be2c-461c-95d8-687cb6527c2f-kube-api-access-d9xkl\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.806011 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.805960 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57e3e3e5-be2c-461c-95d8-687cb6527c2f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.806198 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.806023 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.806198 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.806058 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.806198 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.806100 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d9xkl\" (UniqueName: \"kubernetes.io/projected/57e3e3e5-be2c-461c-95d8-687cb6527c2f-kube-api-access-d9xkl\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.806398 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:24.806235 2565 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 17 17:25:24.806398 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:24.806336 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls podName:57e3e3e5-be2c-461c-95d8-687cb6527c2f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:25.306314919 +0000 UTC m=+65.964359217 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-t4qtq" (UID: "57e3e3e5-be2c-461c-95d8-687cb6527c2f") : secret "prometheus-operator-tls" not found
Apr 17 17:25:24.816777 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.816740 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.816971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.816796 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/57e3e3e5-be2c-461c-95d8-687cb6527c2f-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:24.819742 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:24.819713 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d9xkl\" (UniqueName: \"kubernetes.io/projected/57e3e3e5-be2c-461c-95d8-687cb6527c2f-kube-api-access-d9xkl\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:25.311096 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.311053 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:25.311325 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:25.311231 2565 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found
Apr 17 17:25:25.311414 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:25.311333 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls podName:57e3e3e5-be2c-461c-95d8-687cb6527c2f nodeName:}" failed. No retries permitted until 2026-04-17 17:25:26.311310911 +0000 UTC m=+66.969355204 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-t4qtq" (UID: "57e3e3e5-be2c-461c-95d8-687cb6527c2f") : secret "prometheus-operator-tls" not found
Apr 17 17:25:25.613456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.613372 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:25:25.615670 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.615643 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 17 17:25:25.626388 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.626362 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa-metrics-certs\") pod \"network-metrics-daemon-sx5fl\" (UID: \"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa\") " pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:25:25.714401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.714361 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:25:25.717083 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.717056 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 17 17:25:25.727202 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.727180 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 17 17:25:25.738840 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.738817 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c5dcs\" (UniqueName: \"kubernetes.io/projected/912cdda1-8b21-4e90-bb49-4d95a21c8619-kube-api-access-c5dcs\") pod \"network-check-target-5vptw\" (UID: \"912cdda1-8b21-4e90-bb49-4d95a21c8619\") " pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:25:25.761074 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.761039 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-6nwcp\""
Apr 17 17:25:25.763983 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.763955 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-m76bp\""
Apr 17 17:25:25.768773 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.768749 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-sx5fl"
Apr 17 17:25:25.772470 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:25.772440 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-5vptw"
Apr 17 17:25:26.319711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:26.319664 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:26.322327 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:26.322302 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/57e3e3e5-be2c-461c-95d8-687cb6527c2f-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-t4qtq\" (UID: \"57e3e3e5-be2c-461c-95d8-687cb6527c2f\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:26.412977 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:26.412952 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"
Apr 17 17:25:26.467104 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:26.467044 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-sx5fl"]
Apr 17 17:25:26.471545 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:26.471510 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda7e87bdf_b541_402b_b5a0_91ea1c3a0bfa.slice/crio-8fb004e9d22bc01d3063f02f485fbd9df6dd8a80d9a0ad294364dfe49b6605aa WatchSource:0}: Error finding container 8fb004e9d22bc01d3063f02f485fbd9df6dd8a80d9a0ad294364dfe49b6605aa: Status 404 returned error can't find the container with id 8fb004e9d22bc01d3063f02f485fbd9df6dd8a80d9a0ad294364dfe49b6605aa
Apr 17 17:25:26.484140 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:26.483997 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-5vptw"]
Apr 17 17:25:26.491345 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:26.491263 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod912cdda1_8b21_4e90_bb49_4d95a21c8619.slice/crio-fd58c63ebe599de179ae6c15dea28948faf2a1a5472948113ae0d236e421b1c8 WatchSource:0}: Error finding container fd58c63ebe599de179ae6c15dea28948faf2a1a5472948113ae0d236e421b1c8: Status 404 returned error can't find the container with id fd58c63ebe599de179ae6c15dea28948faf2a1a5472948113ae0d236e421b1c8
Apr 17 17:25:26.562885 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:26.562805 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-t4qtq"]
Apr 17 17:25:26.565442 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:26.565414 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57e3e3e5_be2c_461c_95d8_687cb6527c2f.slice/crio-8f5b965b0a8b72a4d54d6a0fc0f102bbf6cb633a96bfa04ec75c9c6cb6570547 WatchSource:0}: Error finding container 8f5b965b0a8b72a4d54d6a0fc0f102bbf6cb633a96bfa04ec75c9c6cb6570547: Status 404 returned error can't find the container with id 8f5b965b0a8b72a4d54d6a0fc0f102bbf6cb633a96bfa04ec75c9c6cb6570547
Apr 17 17:25:27.245552 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.245513 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" event={"ID":"f5a7faea-7a56-43f6-bd25-8e96fe627a31","Type":"ContainerStarted","Data":"ef14a85fe80ad035e11d0931d82eb2d44de8f89d04f5dd377c81a989aed011c7"}
Apr 17 17:25:27.247728 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.247697 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" event={"ID":"ef67b48f-c190-4025-9e62-fb31e9c1b8fa","Type":"ContainerStarted","Data":"10ea947971cb6f69707c699dbe29eacc397e4a4d47a800fc66327e309d7358c0"}
Apr 17 17:25:27.247864 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.247850 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh"
Apr 17 17:25:27.249760 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.249726 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh"
Apr 17 17:25:27.249915 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.249894 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sx5fl" event={"ID":"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa","Type":"ContainerStarted","Data":"8fb004e9d22bc01d3063f02f485fbd9df6dd8a80d9a0ad294364dfe49b6605aa"}
Apr 17 17:25:27.251223 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.251196 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq" event={"ID":"57e3e3e5-be2c-461c-95d8-687cb6527c2f","Type":"ContainerStarted","Data":"8f5b965b0a8b72a4d54d6a0fc0f102bbf6cb633a96bfa04ec75c9c6cb6570547"}
Apr 17 17:25:27.252553 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.252512 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5vptw" event={"ID":"912cdda1-8b21-4e90-bb49-4d95a21c8619","Type":"ContainerStarted","Data":"fd58c63ebe599de179ae6c15dea28948faf2a1a5472948113ae0d236e421b1c8"}
Apr 17 17:25:27.255718 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.255693 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-jrddb" event={"ID":"f8a29699-81b9-4195-bf3c-aec014a689f0","Type":"ContainerStarted","Data":"2d83110df3a519b0a7881411ef0ca9c65e68956924a5a644f4b710bdbcc93f0b"}
Apr 17 17:25:27.261383 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.261222 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-5b4fc94749-gkz5w" podStartSLOduration=2.113194933 podStartE2EDuration="7.261205838s" podCreationTimestamp="2026-04-17 17:25:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.176539957 +0000 UTC m=+61.834584243" lastFinishedPulling="2026-04-17 17:25:26.324550856 +0000 UTC m=+66.982595148" observedRunningTime="2026-04-17 17:25:27.260394081 +0000 UTC m=+67.918438391" watchObservedRunningTime="2026-04-17 17:25:27.261205838 +0000 UTC m=+67.919250128"
Apr 17 17:25:27.299466 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.299405 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7bfb98874b-9v6mh" podStartSLOduration=2.114955601 podStartE2EDuration="7.299385788s" podCreationTimestamp="2026-04-17 17:25:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.157750006 +0000 UTC m=+61.815794295" lastFinishedPulling="2026-04-17 17:25:26.342180183 +0000 UTC m=+67.000224482" observedRunningTime="2026-04-17 17:25:27.280309211 +0000 UTC m=+67.938353520" watchObservedRunningTime="2026-04-17 17:25:27.299385788 +0000 UTC m=+67.957430097"
Apr 17 17:25:27.299659 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:27.299609 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-jrddb" podStartSLOduration=2.348141611 podStartE2EDuration="7.299598669s" podCreationTimestamp="2026-04-17 17:25:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:21.373057378 +0000 UTC m=+62.031101670" lastFinishedPulling="2026-04-17 17:25:26.324514442 +0000 UTC m=+66.982558728" observedRunningTime="2026-04-17 17:25:27.298947829 +0000 UTC m=+67.956992143" watchObservedRunningTime="2026-04-17 17:25:27.299598669 +0000 UTC m=+67.957642978"
Apr 17 17:25:28.263428 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:28.263296 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sx5fl" event={"ID":"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa","Type":"ContainerStarted","Data":"6d7f265902a0f5bd423ea143d4cb029c06926350ff0a773fa3c22bebad5231aa"}
Apr 17 17:25:28.263428 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:28.263347 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-sx5fl" event={"ID":"a7e87bdf-b541-402b-b5a0-91ea1c3a0bfa","Type":"ContainerStarted","Data":"dbd294aa6656ba5fd27e1e39227d77fa106457af24dcb5b4e8fb16bbd2755618"}
Apr 17 17:25:28.267400 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:28.267367 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq" event={"ID":"57e3e3e5-be2c-461c-95d8-687cb6527c2f","Type":"ContainerStarted","Data":"e6afc981e5e3ba59999a9c0198c1e20fdf4a00a024c5e481044b7bf4085125b8"}
Apr 17 17:25:28.267400 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:28.267407 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq" event={"ID":"57e3e3e5-be2c-461c-95d8-687cb6527c2f","Type":"ContainerStarted","Data":"b73c05ad2f952b00091e71a621cfd421fc432c28852f039066e1d9c37350eda0"}
Apr 17 17:25:28.280229 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:28.280165 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-sx5fl" podStartSLOduration=67.268155287 podStartE2EDuration="1m8.280145853s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:26.474227517 +0000 UTC m=+67.132271808" lastFinishedPulling="2026-04-17 17:25:27.486218087 +0000 UTC m=+68.144262374" observedRunningTime="2026-04-17 17:25:28.278955158 +0000 UTC m=+68.936999477" watchObservedRunningTime="2026-04-17 17:25:28.280145853 +0000 UTC m=+68.938190163"
Apr 17 17:25:28.297538 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:28.297482 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-t4qtq" podStartSLOduration=2.86684006 podStartE2EDuration="4.297463247s" podCreationTimestamp="2026-04-17 17:25:24 +0000 UTC" firstStartedPulling="2026-04-17 17:25:26.567337271 +0000 UTC m=+67.225381558" lastFinishedPulling="2026-04-17 17:25:27.997960446 +0000 UTC m=+68.656004745" observedRunningTime="2026-04-17 17:25:28.295722238 +0000 UTC m=+68.953766557" watchObservedRunningTime="2026-04-17 17:25:28.297463247 +0000 UTC m=+68.955507556"
Apr 17 17:25:30.042942 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.042911 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-gb9wj"]
Apr 17 17:25:30.046474 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.046454 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.058751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.058525 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\""
Apr 17 17:25:30.058888 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.058800 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\""
Apr 17 17:25:30.059674 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.059558 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-ts64k\""
Apr 17 17:25:30.059674 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.059653 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\""
Apr 17 17:25:30.160056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160007 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.160056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160058 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-root\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.160325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160094 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-metrics-client-ca\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.160325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160126 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-tls\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.160325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160171 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-textfile\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.160325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160228 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-accelerators-collector-config\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.160325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160297 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2mmfq\" (UniqueName: \"kubernetes.io/projected/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-kube-api-access-2mmfq\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.160325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160322 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-sys\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.160610 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.160350 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-wtmp\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.260712 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260657 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-textfile\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.260922 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260724 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-accelerators-collector-config\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj"
Apr 17 17:25:30.260922 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260746 2565
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2mmfq\" (UniqueName: \"kubernetes.io/projected/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-kube-api-access-2mmfq\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.260922 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260778 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-sys\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.260922 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260816 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-wtmp\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.260922 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260862 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.260922 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260888 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-root\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.260922 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:25:30.260918 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-metrics-client-ca\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.261286 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260950 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-tls\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.261286 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.260944 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-sys\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.261286 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.261030 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-root\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.261286 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.261093 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-textfile\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.261286 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:30.261175 2565 secret.go:189] 
Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 17 17:25:30.261286 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:25:30.261235 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-tls podName:768baf92-5c56-4bc1-91c9-eb70f2d76c7b nodeName:}" failed. No retries permitted until 2026-04-17 17:25:30.761216231 +0000 UTC m=+71.419260530 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-tls") pod "node-exporter-gb9wj" (UID: "768baf92-5c56-4bc1-91c9-eb70f2d76c7b") : secret "node-exporter-tls" not found Apr 17 17:25:30.261286 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.261231 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-wtmp\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.261625 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.261429 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-accelerators-collector-config\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.261625 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.261505 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-metrics-client-ca\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " 
pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.263315 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.263297 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.270711 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.270680 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2mmfq\" (UniqueName: \"kubernetes.io/projected/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-kube-api-access-2mmfq\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.274901 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.274874 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-5vptw" event={"ID":"912cdda1-8b21-4e90-bb49-4d95a21c8619","Type":"ContainerStarted","Data":"54950e16cc09b86e36921832b946f98ef7789e890db1bcab933fc41b74fe0f04"} Apr 17 17:25:30.275167 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.275144 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-76c557d8ff-j5bm9"] Apr 17 17:25:30.279310 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.279291 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:25:30.279415 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.279392 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.281762 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.281584 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:25:30.281762 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.281617 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-p4d47\"" Apr 17 17:25:30.281762 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.281634 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:25:30.281995 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.281980 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:25:30.282073 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.282029 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:25:30.282129 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.282075 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:25:30.282129 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.282085 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:25:30.282227 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.282145 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:25:30.291005 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.290986 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c557d8ff-j5bm9"] Apr 17 17:25:30.293457 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.293380 
2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-5vptw" podStartSLOduration=67.435150086 podStartE2EDuration="1m10.293366283s" podCreationTimestamp="2026-04-17 17:24:20 +0000 UTC" firstStartedPulling="2026-04-17 17:25:26.495948955 +0000 UTC m=+67.153993255" lastFinishedPulling="2026-04-17 17:25:29.354165164 +0000 UTC m=+70.012209452" observedRunningTime="2026-04-17 17:25:30.292597574 +0000 UTC m=+70.950641885" watchObservedRunningTime="2026-04-17 17:25:30.293366283 +0000 UTC m=+70.951410592" Apr 17 17:25:30.361851 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.361817 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-service-ca\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.362025 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.361886 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-serving-cert\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.362025 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.361909 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-console-config\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.362025 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.361930 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vldf5\" (UniqueName: \"kubernetes.io/projected/57b4895e-8c8e-4e99-8783-ba287f387d57-kube-api-access-vldf5\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.362025 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.361979 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-oauth-config\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.362025 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.362002 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-oauth-serving-cert\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.463277 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.463219 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-oauth-serving-cert\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.463364 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.463310 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-service-ca\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " 
pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.463419 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.463362 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-serving-cert\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.463419 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.463386 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-console-config\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.463419 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.463413 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vldf5\" (UniqueName: \"kubernetes.io/projected/57b4895e-8c8e-4e99-8783-ba287f387d57-kube-api-access-vldf5\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.463541 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.463454 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-oauth-config\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.464071 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.464042 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-oauth-serving-cert\") pod 
\"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.464178 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.464057 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-service-ca\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.464178 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.464112 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-console-config\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.465858 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.465833 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-oauth-config\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.465937 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.465858 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-serving-cert\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.471623 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.471592 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vldf5\" (UniqueName: 
\"kubernetes.io/projected/57b4895e-8c8e-4e99-8783-ba287f387d57-kube-api-access-vldf5\") pod \"console-76c557d8ff-j5bm9\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.588768 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.588681 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:30.715407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.715348 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-76c557d8ff-j5bm9"] Apr 17 17:25:30.727434 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:30.727393 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod57b4895e_8c8e_4e99_8783_ba287f387d57.slice/crio-106014e7a58a710afc961739c7874c842b7cb655286eb2ed4d45bcaf9b34cab1 WatchSource:0}: Error finding container 106014e7a58a710afc961739c7874c842b7cb655286eb2ed4d45bcaf9b34cab1: Status 404 returned error can't find the container with id 106014e7a58a710afc961739c7874c842b7cb655286eb2ed4d45bcaf9b34cab1 Apr 17 17:25:30.766583 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.766547 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-tls\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.768840 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.768810 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/768baf92-5c56-4bc1-91c9-eb70f2d76c7b-node-exporter-tls\") pod \"node-exporter-gb9wj\" (UID: \"768baf92-5c56-4bc1-91c9-eb70f2d76c7b\") " pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.967030 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:30.966996 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-gb9wj" Apr 17 17:25:30.975185 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:30.975146 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod768baf92_5c56_4bc1_91c9_eb70f2d76c7b.slice/crio-32e3f64bcb3dfa8432c2f03e9a9c71a696b262fe0e297b3ac1573ae76ede480a WatchSource:0}: Error finding container 32e3f64bcb3dfa8432c2f03e9a9c71a696b262fe0e297b3ac1573ae76ede480a: Status 404 returned error can't find the container with id 32e3f64bcb3dfa8432c2f03e9a9c71a696b262fe0e297b3ac1573ae76ede480a Apr 17 17:25:31.279051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:31.278962 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gb9wj" event={"ID":"768baf92-5c56-4bc1-91c9-eb70f2d76c7b","Type":"ContainerStarted","Data":"32e3f64bcb3dfa8432c2f03e9a9c71a696b262fe0e297b3ac1573ae76ede480a"} Apr 17 17:25:31.279977 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:31.279954 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c557d8ff-j5bm9" event={"ID":"57b4895e-8c8e-4e99-8783-ba287f387d57","Type":"ContainerStarted","Data":"106014e7a58a710afc961739c7874c842b7cb655286eb2ed4d45bcaf9b34cab1"} Apr 17 17:25:32.284044 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:32.283954 2565 generic.go:358] "Generic (PLEG): container finished" podID="768baf92-5c56-4bc1-91c9-eb70f2d76c7b" containerID="916609dfbf6a55122dd2e6b7de66cd5c18976b1646017b86246cb39a64e8e3a4" exitCode=0 Apr 17 17:25:32.284044 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:32.284016 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gb9wj" 
event={"ID":"768baf92-5c56-4bc1-91c9-eb70f2d76c7b","Type":"ContainerDied","Data":"916609dfbf6a55122dd2e6b7de66cd5c18976b1646017b86246cb39a64e8e3a4"} Apr 17 17:25:33.289297 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:33.289259 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gb9wj" event={"ID":"768baf92-5c56-4bc1-91c9-eb70f2d76c7b","Type":"ContainerStarted","Data":"82eeb4d52e14a0a160be813bd4b205b6562c310312893ab2bddc226c283a1e5b"} Apr 17 17:25:33.289297 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:33.289301 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-gb9wj" event={"ID":"768baf92-5c56-4bc1-91c9-eb70f2d76c7b","Type":"ContainerStarted","Data":"02f31b38e9ba8c643c3b72497be69ac227f75b5ab97d71771516fb6e5e0331f2"} Apr 17 17:25:33.313019 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:33.312961 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-gb9wj" podStartSLOduration=2.359738703 podStartE2EDuration="3.312938211s" podCreationTimestamp="2026-04-17 17:25:30 +0000 UTC" firstStartedPulling="2026-04-17 17:25:30.976817668 +0000 UTC m=+71.634861957" lastFinishedPulling="2026-04-17 17:25:31.930017175 +0000 UTC m=+72.588061465" observedRunningTime="2026-04-17 17:25:33.311512335 +0000 UTC m=+73.969556655" watchObservedRunningTime="2026-04-17 17:25:33.312938211 +0000 UTC m=+73.970982520" Apr 17 17:25:34.293557 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.293518 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c557d8ff-j5bm9" event={"ID":"57b4895e-8c8e-4e99-8783-ba287f387d57","Type":"ContainerStarted","Data":"16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67"} Apr 17 17:25:34.314523 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.314467 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-76c557d8ff-j5bm9" 
podStartSLOduration=1.182554861 podStartE2EDuration="4.314453899s" podCreationTimestamp="2026-04-17 17:25:30 +0000 UTC" firstStartedPulling="2026-04-17 17:25:30.73145178 +0000 UTC m=+71.389496068" lastFinishedPulling="2026-04-17 17:25:33.86335082 +0000 UTC m=+74.521395106" observedRunningTime="2026-04-17 17:25:34.31432491 +0000 UTC m=+74.972369229" watchObservedRunningTime="2026-04-17 17:25:34.314453899 +0000 UTC m=+74.972498207" Apr 17 17:25:34.728505 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.728469 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns"] Apr 17 17:25:34.730641 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.730624 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" Apr 17 17:25:34.732759 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.732737 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 17 17:25:34.732854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.732803 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-l7c6s\"" Apr 17 17:25:34.741793 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.741771 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns"] Apr 17 17:25:34.798890 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.798851 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/286e6e57-04bf-44ea-87f4-b8e042bdcf20-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7vwns\" (UID: \"286e6e57-04bf-44ea-87f4-b8e042bdcf20\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" Apr 17 17:25:34.899774 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:25:34.899743 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/286e6e57-04bf-44ea-87f4-b8e042bdcf20-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7vwns\" (UID: \"286e6e57-04bf-44ea-87f4-b8e042bdcf20\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" Apr 17 17:25:34.902273 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:34.902232 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/286e6e57-04bf-44ea-87f4-b8e042bdcf20-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-7vwns\" (UID: \"286e6e57-04bf-44ea-87f4-b8e042bdcf20\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" Apr 17 17:25:35.039686 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:35.039594 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" Apr 17 17:25:35.157887 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:35.157856 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns"] Apr 17 17:25:35.160896 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:25:35.160868 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod286e6e57_04bf_44ea_87f4_b8e042bdcf20.slice/crio-e3ddcfbcb53d36f834fbf3fbdb09ed6842c762188f0d58514d2af81f5a71a041 WatchSource:0}: Error finding container e3ddcfbcb53d36f834fbf3fbdb09ed6842c762188f0d58514d2af81f5a71a041: Status 404 returned error can't find the container with id e3ddcfbcb53d36f834fbf3fbdb09ed6842c762188f0d58514d2af81f5a71a041 Apr 17 17:25:35.297019 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:35.296938 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" 
event={"ID":"286e6e57-04bf-44ea-87f4-b8e042bdcf20","Type":"ContainerStarted","Data":"e3ddcfbcb53d36f834fbf3fbdb09ed6842c762188f0d58514d2af81f5a71a041"} Apr 17 17:25:37.306191 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:37.306157 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" event={"ID":"286e6e57-04bf-44ea-87f4-b8e042bdcf20","Type":"ContainerStarted","Data":"a78103b545b3f3b825217189291c4cd3b232f060c2d6ed344604380ebae5b961"} Apr 17 17:25:37.306622 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:37.306351 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" Apr 17 17:25:37.311021 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:37.310997 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" Apr 17 17:25:37.324458 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:37.324416 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-7vwns" podStartSLOduration=1.857891135 podStartE2EDuration="3.324402583s" podCreationTimestamp="2026-04-17 17:25:34 +0000 UTC" firstStartedPulling="2026-04-17 17:25:35.162798189 +0000 UTC m=+75.820842475" lastFinishedPulling="2026-04-17 17:25:36.629309627 +0000 UTC m=+77.287353923" observedRunningTime="2026-04-17 17:25:37.323311855 +0000 UTC m=+77.981356163" watchObservedRunningTime="2026-04-17 17:25:37.324402583 +0000 UTC m=+77.982446890" Apr 17 17:25:40.589590 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:40.589452 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:40.589590 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:40.589505 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:40.594409 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:40.594386 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:41.320182 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:41.320153 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:25:43.236611 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:43.236581 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-5b896476cd-bjzsc" Apr 17 17:25:47.831519 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:25:47.831488 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c557d8ff-j5bm9"] Apr 17 17:26:02.286613 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:02.286577 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-5vptw" Apr 17 17:26:12.850437 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:12.850366 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-76c557d8ff-j5bm9" podUID="57b4895e-8c8e-4e99-8783-ba287f387d57" containerName="console" containerID="cri-o://16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67" gracePeriod=15 Apr 17 17:26:13.075394 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.075371 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c557d8ff-j5bm9_57b4895e-8c8e-4e99-8783-ba287f387d57/console/0.log" Apr 17 17:26:13.075511 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.075444 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:26:13.185173 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.185137 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-oauth-config\") pod \"57b4895e-8c8e-4e99-8783-ba287f387d57\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " Apr 17 17:26:13.185173 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.185178 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vldf5\" (UniqueName: \"kubernetes.io/projected/57b4895e-8c8e-4e99-8783-ba287f387d57-kube-api-access-vldf5\") pod \"57b4895e-8c8e-4e99-8783-ba287f387d57\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " Apr 17 17:26:13.185402 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.185202 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-console-config\") pod \"57b4895e-8c8e-4e99-8783-ba287f387d57\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " Apr 17 17:26:13.185402 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.185276 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-service-ca\") pod \"57b4895e-8c8e-4e99-8783-ba287f387d57\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " Apr 17 17:26:13.185402 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.185312 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-serving-cert\") pod \"57b4895e-8c8e-4e99-8783-ba287f387d57\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " Apr 17 17:26:13.185402 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:26:13.185331 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-oauth-serving-cert\") pod \"57b4895e-8c8e-4e99-8783-ba287f387d57\" (UID: \"57b4895e-8c8e-4e99-8783-ba287f387d57\") " Apr 17 17:26:13.185739 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.185710 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-console-config" (OuterVolumeSpecName: "console-config") pod "57b4895e-8c8e-4e99-8783-ba287f387d57" (UID: "57b4895e-8c8e-4e99-8783-ba287f387d57"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:13.185828 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.185746 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-service-ca" (OuterVolumeSpecName: "service-ca") pod "57b4895e-8c8e-4e99-8783-ba287f387d57" (UID: "57b4895e-8c8e-4e99-8783-ba287f387d57"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:13.185828 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.185765 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "57b4895e-8c8e-4e99-8783-ba287f387d57" (UID: "57b4895e-8c8e-4e99-8783-ba287f387d57"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:26:13.187636 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.187608 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/57b4895e-8c8e-4e99-8783-ba287f387d57-kube-api-access-vldf5" (OuterVolumeSpecName: "kube-api-access-vldf5") pod "57b4895e-8c8e-4e99-8783-ba287f387d57" (UID: "57b4895e-8c8e-4e99-8783-ba287f387d57"). InnerVolumeSpecName "kube-api-access-vldf5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:26:13.187636 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.187615 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "57b4895e-8c8e-4e99-8783-ba287f387d57" (UID: "57b4895e-8c8e-4e99-8783-ba287f387d57"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:13.187763 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.187697 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "57b4895e-8c8e-4e99-8783-ba287f387d57" (UID: "57b4895e-8c8e-4e99-8783-ba287f387d57"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:26:13.285815 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.285782 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-oauth-config\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:26:13.285815 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.285808 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vldf5\" (UniqueName: \"kubernetes.io/projected/57b4895e-8c8e-4e99-8783-ba287f387d57-kube-api-access-vldf5\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:26:13.285815 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.285817 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-console-config\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:26:13.285815 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.285827 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-service-ca\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:26:13.286062 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.285835 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/57b4895e-8c8e-4e99-8783-ba287f387d57-console-serving-cert\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:26:13.286062 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.285844 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/57b4895e-8c8e-4e99-8783-ba287f387d57-oauth-serving-cert\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:26:13.410874 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:26:13.410848 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-76c557d8ff-j5bm9_57b4895e-8c8e-4e99-8783-ba287f387d57/console/0.log" Apr 17 17:26:13.411037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.410887 2565 generic.go:358] "Generic (PLEG): container finished" podID="57b4895e-8c8e-4e99-8783-ba287f387d57" containerID="16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67" exitCode=2 Apr 17 17:26:13.411037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.410937 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c557d8ff-j5bm9" event={"ID":"57b4895e-8c8e-4e99-8783-ba287f387d57","Type":"ContainerDied","Data":"16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67"} Apr 17 17:26:13.411037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.410958 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-76c557d8ff-j5bm9" event={"ID":"57b4895e-8c8e-4e99-8783-ba287f387d57","Type":"ContainerDied","Data":"106014e7a58a710afc961739c7874c842b7cb655286eb2ed4d45bcaf9b34cab1"} Apr 17 17:26:13.411037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.410958 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-76c557d8ff-j5bm9" Apr 17 17:26:13.411037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.410984 2565 scope.go:117] "RemoveContainer" containerID="16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67" Apr 17 17:26:13.419426 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.419404 2565 scope.go:117] "RemoveContainer" containerID="16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67" Apr 17 17:26:13.419667 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:26:13.419647 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67\": container with ID starting with 16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67 not found: ID does not exist" containerID="16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67" Apr 17 17:26:13.419712 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.419676 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67"} err="failed to get container status \"16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67\": rpc error: code = NotFound desc = could not find container \"16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67\": container with ID starting with 16af5b579a9c887c63be7c6c7bea4e2bab71492a8a7fba8498978227a73ffb67 not found: ID does not exist" Apr 17 17:26:13.437081 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.437008 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-76c557d8ff-j5bm9"] Apr 17 17:26:13.447347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:13.447322 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-76c557d8ff-j5bm9"] Apr 17 17:26:13.951541 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:26:13.951508 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="57b4895e-8c8e-4e99-8783-ba287f387d57" path="/var/lib/kubelet/pods/57b4895e-8c8e-4e99-8783-ba287f387d57/volumes" Apr 17 17:26:57.612638 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.612599 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-85b699cf6b-7mxtm"] Apr 17 17:26:57.613163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.612965 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="57b4895e-8c8e-4e99-8783-ba287f387d57" containerName="console" Apr 17 17:26:57.613163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.612983 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="57b4895e-8c8e-4e99-8783-ba287f387d57" containerName="console" Apr 17 17:26:57.613163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.613038 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="57b4895e-8c8e-4e99-8783-ba287f387d57" containerName="console" Apr 17 17:26:57.614956 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.614934 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.619688 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.619661 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:26:57.619688 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.619678 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:26:57.620299 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.620275 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:26:57.620401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.620301 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:26:57.620401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.620356 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:26:57.620530 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.620434 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-p4d47\"" Apr 17 17:26:57.620595 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.620533 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:26:57.620595 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.620588 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:26:57.623912 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.623883 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 17:26:57.628939 ip-10-0-133-87 kubenswrapper[2565]: 
I0417 17:26:57.628919 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b699cf6b-7mxtm"] Apr 17 17:26:57.707523 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.707490 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-service-ca\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.707697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.707535 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-serving-cert\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.707697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.707557 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-oauth-serving-cert\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.707697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.707580 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-oauth-config\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.707697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.707612 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rtwf\" (UniqueName: \"kubernetes.io/projected/16f78011-f1a7-4eb7-be9b-042004c76870-kube-api-access-8rtwf\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.707697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.707631 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-console-config\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.707697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.707677 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-trusted-ca-bundle\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.808897 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.808853 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-oauth-config\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.808897 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.808900 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rtwf\" (UniqueName: \"kubernetes.io/projected/16f78011-f1a7-4eb7-be9b-042004c76870-kube-api-access-8rtwf\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " 
pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809159 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.808925 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-console-config\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809159 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.808946 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-trusted-ca-bundle\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809159 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.808969 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-service-ca\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809159 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.809094 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-serving-cert\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809159 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.809129 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-oauth-serving-cert\") pod \"console-85b699cf6b-7mxtm\" 
(UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809804 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.809776 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-service-ca\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809929 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.809829 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-oauth-serving-cert\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809929 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.809828 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-console-config\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.809929 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.809915 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-trusted-ca-bundle\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.811426 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.811398 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-oauth-config\") pod 
\"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.811520 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.811493 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-serving-cert\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.817734 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.817703 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rtwf\" (UniqueName: \"kubernetes.io/projected/16f78011-f1a7-4eb7-be9b-042004c76870-kube-api-access-8rtwf\") pod \"console-85b699cf6b-7mxtm\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:57.928989 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:57.928955 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:26:58.048764 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:58.048731 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-85b699cf6b-7mxtm"] Apr 17 17:26:58.051928 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:26:58.051896 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod16f78011_f1a7_4eb7_be9b_042004c76870.slice/crio-ca5ea328125c63aa8a93003fed96f0793ccf719cf40faf38fd509c4e84eb0200 WatchSource:0}: Error finding container ca5ea328125c63aa8a93003fed96f0793ccf719cf40faf38fd509c4e84eb0200: Status 404 returned error can't find the container with id ca5ea328125c63aa8a93003fed96f0793ccf719cf40faf38fd509c4e84eb0200 Apr 17 17:26:58.529028 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:58.528990 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b699cf6b-7mxtm" event={"ID":"16f78011-f1a7-4eb7-be9b-042004c76870","Type":"ContainerStarted","Data":"ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8"} Apr 17 17:26:58.529028 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:58.529026 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b699cf6b-7mxtm" event={"ID":"16f78011-f1a7-4eb7-be9b-042004c76870","Type":"ContainerStarted","Data":"ca5ea328125c63aa8a93003fed96f0793ccf719cf40faf38fd509c4e84eb0200"} Apr 17 17:26:58.546084 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:26:58.545834 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-85b699cf6b-7mxtm" podStartSLOduration=1.545815451 podStartE2EDuration="1.545815451s" podCreationTimestamp="2026-04-17 17:26:57 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:26:58.545511483 +0000 UTC m=+159.203555792" 
watchObservedRunningTime="2026-04-17 17:26:58.545815451 +0000 UTC m=+159.203859739" Apr 17 17:27:07.929935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:27:07.929895 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:27:07.929935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:27:07.929947 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:27:07.934564 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:27:07.934547 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:27:08.558672 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:27:08.558644 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:28:32.320192 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:32.320153 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85b699cf6b-7mxtm"] Apr 17 17:28:33.623432 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.623398 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-9c4x9"] Apr 17 17:28:33.625638 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.625621 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.629629 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.629612 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 17 17:28:33.635589 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.635567 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9c4x9"] Apr 17 17:28:33.692043 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.692008 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5fd1192d-ea25-4070-ac07-30ae8b3bade1-kubelet-config\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.692206 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.692055 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5fd1192d-ea25-4070-ac07-30ae8b3bade1-dbus\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.692206 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.692094 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fd1192d-ea25-4070-ac07-30ae8b3bade1-original-pull-secret\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.793416 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.793374 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/5fd1192d-ea25-4070-ac07-30ae8b3bade1-original-pull-secret\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.793416 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.793422 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5fd1192d-ea25-4070-ac07-30ae8b3bade1-kubelet-config\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.793620 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.793467 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5fd1192d-ea25-4070-ac07-30ae8b3bade1-dbus\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.793620 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.793569 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/5fd1192d-ea25-4070-ac07-30ae8b3bade1-kubelet-config\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.793692 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.793618 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/5fd1192d-ea25-4070-ac07-30ae8b3bade1-dbus\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.795659 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.795639 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/5fd1192d-ea25-4070-ac07-30ae8b3bade1-original-pull-secret\") pod \"global-pull-secret-syncer-9c4x9\" (UID: \"5fd1192d-ea25-4070-ac07-30ae8b3bade1\") " pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:33.934530 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:33.934494 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-9c4x9" Apr 17 17:28:34.050044 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:34.050009 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-9c4x9"] Apr 17 17:28:34.053124 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:28:34.053093 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5fd1192d_ea25_4070_ac07_30ae8b3bade1.slice/crio-44c073ed24593573aa5735e621b1277a2326f655981929f5e013224a59f28c37 WatchSource:0}: Error finding container 44c073ed24593573aa5735e621b1277a2326f655981929f5e013224a59f28c37: Status 404 returned error can't find the container with id 44c073ed24593573aa5735e621b1277a2326f655981929f5e013224a59f28c37 Apr 17 17:28:34.775668 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:34.775630 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9c4x9" event={"ID":"5fd1192d-ea25-4070-ac07-30ae8b3bade1","Type":"ContainerStarted","Data":"44c073ed24593573aa5735e621b1277a2326f655981929f5e013224a59f28c37"} Apr 17 17:28:38.788039 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:38.788005 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-9c4x9" event={"ID":"5fd1192d-ea25-4070-ac07-30ae8b3bade1","Type":"ContainerStarted","Data":"90f49cfd381c0384e4b87102bb8e0af90ae7d5ed168d829d2996ea6de1bc6821"} Apr 17 17:28:38.808811 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:38.808752 2565 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-9c4x9" podStartSLOduration=1.670234421 podStartE2EDuration="5.808736763s" podCreationTimestamp="2026-04-17 17:28:33 +0000 UTC" firstStartedPulling="2026-04-17 17:28:34.054701441 +0000 UTC m=+254.712745727" lastFinishedPulling="2026-04-17 17:28:38.19320378 +0000 UTC m=+258.851248069" observedRunningTime="2026-04-17 17:28:38.808011529 +0000 UTC m=+259.466055838" watchObservedRunningTime="2026-04-17 17:28:38.808736763 +0000 UTC m=+259.466781070" Apr 17 17:28:57.338634 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.338588 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-85b699cf6b-7mxtm" podUID="16f78011-f1a7-4eb7-be9b-042004c76870" containerName="console" containerID="cri-o://ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8" gracePeriod=15 Apr 17 17:28:57.569480 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.569458 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85b699cf6b-7mxtm_16f78011-f1a7-4eb7-be9b-042004c76870/console/0.log" Apr 17 17:28:57.569594 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.569517 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:28:57.675157 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675123 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8rtwf\" (UniqueName: \"kubernetes.io/projected/16f78011-f1a7-4eb7-be9b-042004c76870-kube-api-access-8rtwf\") pod \"16f78011-f1a7-4eb7-be9b-042004c76870\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " Apr 17 17:28:57.675347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675176 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-trusted-ca-bundle\") pod \"16f78011-f1a7-4eb7-be9b-042004c76870\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " Apr 17 17:28:57.675347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675194 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-console-config\") pod \"16f78011-f1a7-4eb7-be9b-042004c76870\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " Apr 17 17:28:57.675347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675213 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-oauth-config\") pod \"16f78011-f1a7-4eb7-be9b-042004c76870\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " Apr 17 17:28:57.675483 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675349 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-oauth-serving-cert\") pod \"16f78011-f1a7-4eb7-be9b-042004c76870\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " Apr 17 17:28:57.675483 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675406 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-service-ca\") pod \"16f78011-f1a7-4eb7-be9b-042004c76870\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " Apr 17 17:28:57.675483 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675439 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-serving-cert\") pod \"16f78011-f1a7-4eb7-be9b-042004c76870\" (UID: \"16f78011-f1a7-4eb7-be9b-042004c76870\") " Apr 17 17:28:57.675681 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675646 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-console-config" (OuterVolumeSpecName: "console-config") pod "16f78011-f1a7-4eb7-be9b-042004c76870" (UID: "16f78011-f1a7-4eb7-be9b-042004c76870"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:57.675681 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675666 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "16f78011-f1a7-4eb7-be9b-042004c76870" (UID: "16f78011-f1a7-4eb7-be9b-042004c76870"). InnerVolumeSpecName "trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:57.675854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675788 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "16f78011-f1a7-4eb7-be9b-042004c76870" (UID: "16f78011-f1a7-4eb7-be9b-042004c76870"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:57.675931 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.675896 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-service-ca" (OuterVolumeSpecName: "service-ca") pod "16f78011-f1a7-4eb7-be9b-042004c76870" (UID: "16f78011-f1a7-4eb7-be9b-042004c76870"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 17 17:28:57.677557 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.677534 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "16f78011-f1a7-4eb7-be9b-042004c76870" (UID: "16f78011-f1a7-4eb7-be9b-042004c76870"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:28:57.677630 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.677612 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "16f78011-f1a7-4eb7-be9b-042004c76870" (UID: "16f78011-f1a7-4eb7-be9b-042004c76870"). InnerVolumeSpecName "console-oauth-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:28:57.677682 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.677659 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/16f78011-f1a7-4eb7-be9b-042004c76870-kube-api-access-8rtwf" (OuterVolumeSpecName: "kube-api-access-8rtwf") pod "16f78011-f1a7-4eb7-be9b-042004c76870" (UID: "16f78011-f1a7-4eb7-be9b-042004c76870"). InnerVolumeSpecName "kube-api-access-8rtwf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:28:57.776890 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.776843 2565 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-oauth-serving-cert\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:28:57.776890 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.776878 2565 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-service-ca\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:28:57.776890 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.776892 2565 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-serving-cert\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:28:57.777130 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.776904 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8rtwf\" (UniqueName: \"kubernetes.io/projected/16f78011-f1a7-4eb7-be9b-042004c76870-kube-api-access-8rtwf\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:28:57.777130 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.776917 2565 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-trusted-ca-bundle\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:28:57.777130 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.776929 2565 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/16f78011-f1a7-4eb7-be9b-042004c76870-console-config\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:28:57.777130 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.776941 2565 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/16f78011-f1a7-4eb7-be9b-042004c76870-console-oauth-config\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:28:57.837314 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.837289 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-85b699cf6b-7mxtm_16f78011-f1a7-4eb7-be9b-042004c76870/console/0.log" Apr 17 17:28:57.837458 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.837326 2565 generic.go:358] "Generic (PLEG): container finished" podID="16f78011-f1a7-4eb7-be9b-042004c76870" containerID="ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8" exitCode=2 Apr 17 17:28:57.837458 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.837367 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b699cf6b-7mxtm" event={"ID":"16f78011-f1a7-4eb7-be9b-042004c76870","Type":"ContainerDied","Data":"ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8"} Apr 17 17:28:57.837458 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.837389 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-85b699cf6b-7mxtm" event={"ID":"16f78011-f1a7-4eb7-be9b-042004c76870","Type":"ContainerDied","Data":"ca5ea328125c63aa8a93003fed96f0793ccf719cf40faf38fd509c4e84eb0200"} Apr 17 17:28:57.837458 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:28:57.837403 2565 scope.go:117] "RemoveContainer" containerID="ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8" Apr 17 17:28:57.837458 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.837431 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-85b699cf6b-7mxtm" Apr 17 17:28:57.845740 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.845719 2565 scope.go:117] "RemoveContainer" containerID="ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8" Apr 17 17:28:57.846008 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:28:57.845985 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8\": container with ID starting with ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8 not found: ID does not exist" containerID="ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8" Apr 17 17:28:57.846086 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.846022 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8"} err="failed to get container status \"ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8\": rpc error: code = NotFound desc = could not find container \"ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8\": container with ID starting with ffb103fe3e7f5e200478fd5060f435d530fc465a954355dffa42254aa48e10f8 not found: ID does not exist" Apr 17 17:28:57.859176 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.859154 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-85b699cf6b-7mxtm"] Apr 17 17:28:57.864940 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.864919 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["openshift-console/console-85b699cf6b-7mxtm"] Apr 17 17:28:57.950897 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:28:57.950822 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="16f78011-f1a7-4eb7-be9b-042004c76870" path="/var/lib/kubelet/pods/16f78011-f1a7-4eb7-be9b-042004c76870/volumes" Apr 17 17:29:19.826335 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:19.826298 2565 kubelet.go:1628] "Image garbage collection succeeded" Apr 17 17:29:22.068623 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.068585 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l89db"] Apr 17 17:29:22.070854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.068917 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="16f78011-f1a7-4eb7-be9b-042004c76870" containerName="console" Apr 17 17:29:22.070854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.068930 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="16f78011-f1a7-4eb7-be9b-042004c76870" containerName="console" Apr 17 17:29:22.070854 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.068999 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="16f78011-f1a7-4eb7-be9b-042004c76870" containerName="console" Apr 17 17:29:22.071671 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.071656 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.073981 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.073946 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 17 17:29:22.073981 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.073967 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\"" Apr 17 17:29:22.074162 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.074069 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-sb8rf\"" Apr 17 17:29:22.074222 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.074184 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 17 17:29:22.074293 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.074273 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 17 17:29:22.074778 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.074763 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 17 17:29:22.082672 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.082652 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l89db"] Apr 17 17:29:22.164635 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.164601 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-cabundle0\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.164828 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.164664 
2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6hh6\" (UniqueName: \"kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-kube-api-access-x6hh6\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.164828 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.164746 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-certificates\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.265523 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.265490 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6hh6\" (UniqueName: \"kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-kube-api-access-x6hh6\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.265523 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.265526 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-certificates\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.265793 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.265550 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-cabundle0\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " 
pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.265793 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:29:22.265653 2565 secret.go:281] references non-existent secret key: ca.crt Apr 17 17:29:22.265793 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:29:22.265675 2565 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt Apr 17 17:29:22.265793 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:29:22.265685 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-l89db: references non-existent secret key: ca.crt Apr 17 17:29:22.265793 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:29:22.265734 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-certificates podName:0baf7ea0-dd40-40ce-81e3-0788ddb190c1 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:22.765719591 +0000 UTC m=+303.423763877 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-certificates") pod "keda-operator-ffbb595cb-l89db" (UID: "0baf7ea0-dd40-40ce-81e3-0788ddb190c1") : references non-existent secret key: ca.crt Apr 17 17:29:22.266134 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.266115 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-cabundle0\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.273993 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.273956 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6hh6\" (UniqueName: \"kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-kube-api-access-x6hh6\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.702456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.702419 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-xjc5j"] Apr 17 17:29:22.705559 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.705541 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:22.708843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.708815 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 17 17:29:22.714292 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.714268 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xjc5j"] Apr 17 17:29:22.769458 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.769425 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-certificates\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.771734 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.771711 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/0baf7ea0-dd40-40ce-81e3-0788ddb190c1-certificates\") pod \"keda-operator-ffbb595cb-l89db\" (UID: \"0baf7ea0-dd40-40ce-81e3-0788ddb190c1\") " pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:22.870137 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.870090 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-certificates\") pod \"keda-admission-cf49989db-xjc5j\" (UID: \"21357cec-cc86-4475-9b85-713951912386\") " pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:22.870366 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.870173 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d25dk\" (UniqueName: 
\"kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-kube-api-access-d25dk\") pod \"keda-admission-cf49989db-xjc5j\" (UID: \"21357cec-cc86-4475-9b85-713951912386\") " pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:22.971401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.971307 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d25dk\" (UniqueName: \"kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-kube-api-access-d25dk\") pod \"keda-admission-cf49989db-xjc5j\" (UID: \"21357cec-cc86-4475-9b85-713951912386\") " pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:22.971401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.971384 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-certificates\") pod \"keda-admission-cf49989db-xjc5j\" (UID: \"21357cec-cc86-4475-9b85-713951912386\") " pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:22.971580 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:29:22.971481 2565 projected.go:264] Couldn't get secret openshift-keda/keda-admission-webhooks-certs: secret "keda-admission-webhooks-certs" not found Apr 17 17:29:22.971580 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:29:22.971503 2565 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-admission-cf49989db-xjc5j: secret "keda-admission-webhooks-certs" not found Apr 17 17:29:22.971580 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:29:22.971575 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-certificates podName:21357cec-cc86-4475-9b85-713951912386 nodeName:}" failed. No retries permitted until 2026-04-17 17:29:23.471553926 +0000 UTC m=+304.129598212 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-certificates") pod "keda-admission-cf49989db-xjc5j" (UID: "21357cec-cc86-4475-9b85-713951912386") : secret "keda-admission-webhooks-certs" not found Apr 17 17:29:22.979795 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.979765 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d25dk\" (UniqueName: \"kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-kube-api-access-d25dk\") pod \"keda-admission-cf49989db-xjc5j\" (UID: \"21357cec-cc86-4475-9b85-713951912386\") " pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:22.981347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:22.981303 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-l89db" Apr 17 17:29:23.126346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:23.126309 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-l89db"] Apr 17 17:29:23.129678 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:29:23.129652 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0baf7ea0_dd40_40ce_81e3_0788ddb190c1.slice/crio-a189c59de6f308a219c990002efb3472de92263afd9ff43acbbf52c784106e7f WatchSource:0}: Error finding container a189c59de6f308a219c990002efb3472de92263afd9ff43acbbf52c784106e7f: Status 404 returned error can't find the container with id a189c59de6f308a219c990002efb3472de92263afd9ff43acbbf52c784106e7f Apr 17 17:29:23.131090 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:23.131074 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:29:23.476166 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:23.476127 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"certificates\" (UniqueName: \"kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-certificates\") pod \"keda-admission-cf49989db-xjc5j\" (UID: \"21357cec-cc86-4475-9b85-713951912386\") " pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:23.478540 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:23.478515 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/21357cec-cc86-4475-9b85-713951912386-certificates\") pod \"keda-admission-cf49989db-xjc5j\" (UID: \"21357cec-cc86-4475-9b85-713951912386\") " pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:23.617199 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:23.617156 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:23.731902 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:23.731827 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-xjc5j"] Apr 17 17:29:23.734478 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:29:23.734448 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod21357cec_cc86_4475_9b85_713951912386.slice/crio-defc10dd4208d206925fdee3f34c788014a9e3ba094a4a94f0036474e82b4f68 WatchSource:0}: Error finding container defc10dd4208d206925fdee3f34c788014a9e3ba094a4a94f0036474e82b4f68: Status 404 returned error can't find the container with id defc10dd4208d206925fdee3f34c788014a9e3ba094a4a94f0036474e82b4f68 Apr 17 17:29:23.908961 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:23.908922 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xjc5j" event={"ID":"21357cec-cc86-4475-9b85-713951912386","Type":"ContainerStarted","Data":"defc10dd4208d206925fdee3f34c788014a9e3ba094a4a94f0036474e82b4f68"} Apr 17 17:29:23.910017 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:29:23.909990 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-l89db" event={"ID":"0baf7ea0-dd40-40ce-81e3-0788ddb190c1","Type":"ContainerStarted","Data":"a189c59de6f308a219c990002efb3472de92263afd9ff43acbbf52c784106e7f"} Apr 17 17:29:25.918750 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:25.918665 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-xjc5j" event={"ID":"21357cec-cc86-4475-9b85-713951912386","Type":"ContainerStarted","Data":"a0421e081d8fe56ccf261c14ce0bb71861bb10449ec45ceded7700f894286018"} Apr 17 17:29:25.919189 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:25.918775 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-xjc5j" Apr 17 17:29:25.935473 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:25.935413 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-xjc5j" podStartSLOduration=2.019689467 podStartE2EDuration="3.935393792s" podCreationTimestamp="2026-04-17 17:29:22 +0000 UTC" firstStartedPulling="2026-04-17 17:29:23.735777921 +0000 UTC m=+304.393822207" lastFinishedPulling="2026-04-17 17:29:25.651482235 +0000 UTC m=+306.309526532" observedRunningTime="2026-04-17 17:29:25.934649486 +0000 UTC m=+306.592693804" watchObservedRunningTime="2026-04-17 17:29:25.935393792 +0000 UTC m=+306.593438102" Apr 17 17:29:29.930916 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:29.930875 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-l89db" event={"ID":"0baf7ea0-dd40-40ce-81e3-0788ddb190c1","Type":"ContainerStarted","Data":"331a9c826cee2eda1d0f642177dd551622955ec8b24577fed361156b61195cfc"} Apr 17 17:29:29.931301 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:29.931002 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-keda/keda-operator-ffbb595cb-l89db"
Apr 17 17:29:29.956510 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:29.956466 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-l89db" podStartSLOduration=1.449668275 podStartE2EDuration="7.956452329s" podCreationTimestamp="2026-04-17 17:29:22 +0000 UTC" firstStartedPulling="2026-04-17 17:29:23.131215788 +0000 UTC m=+303.789260075" lastFinishedPulling="2026-04-17 17:29:29.637999843 +0000 UTC m=+310.296044129" observedRunningTime="2026-04-17 17:29:29.955405408 +0000 UTC m=+310.613449716" watchObservedRunningTime="2026-04-17 17:29:29.956452329 +0000 UTC m=+310.614496636"
Apr 17 17:29:46.923577 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:46.923544 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-xjc5j"
Apr 17 17:29:50.936867 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:29:50.936832 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-l89db"
Apr 17 17:30:36.053237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.053202 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-s9vz7"]
Apr 17 17:30:36.056356 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.056339 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:36.058609 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.058580 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 17 17:30:36.058609 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.058580 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-nbf79\""
Apr 17 17:30:36.059238 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.059223 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 17 17:30:36.065605 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.065583 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-s9vz7"]
Apr 17 17:30:36.210366 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.210329 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a059f062-fdc2-4442-8628-a0ce8440c047-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-s9vz7\" (UID: \"a059f062-fdc2-4442-8628-a0ce8440c047\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:36.210537 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.210387 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-76vkp\" (UniqueName: \"kubernetes.io/projected/a059f062-fdc2-4442-8628-a0ce8440c047-kube-api-access-76vkp\") pod \"cert-manager-webhook-597b96b99b-s9vz7\" (UID: \"a059f062-fdc2-4442-8628-a0ce8440c047\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:36.311684 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.311596 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a059f062-fdc2-4442-8628-a0ce8440c047-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-s9vz7\" (UID: \"a059f062-fdc2-4442-8628-a0ce8440c047\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:36.311684 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.311651 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-76vkp\" (UniqueName: \"kubernetes.io/projected/a059f062-fdc2-4442-8628-a0ce8440c047-kube-api-access-76vkp\") pod \"cert-manager-webhook-597b96b99b-s9vz7\" (UID: \"a059f062-fdc2-4442-8628-a0ce8440c047\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:36.319774 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.319729 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/a059f062-fdc2-4442-8628-a0ce8440c047-bound-sa-token\") pod \"cert-manager-webhook-597b96b99b-s9vz7\" (UID: \"a059f062-fdc2-4442-8628-a0ce8440c047\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:36.319886 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.319870 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-76vkp\" (UniqueName: \"kubernetes.io/projected/a059f062-fdc2-4442-8628-a0ce8440c047-kube-api-access-76vkp\") pod \"cert-manager-webhook-597b96b99b-s9vz7\" (UID: \"a059f062-fdc2-4442-8628-a0ce8440c047\") " pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:36.364707 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.364674 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:36.480361 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:36.480184 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-597b96b99b-s9vz7"]
Apr 17 17:30:36.483047 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:30:36.483019 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda059f062_fdc2_4442_8628_a0ce8440c047.slice/crio-484035d9f38d2524d5fad49c5cade23d9441fb1c3b02a47888815e332ca88adc WatchSource:0}: Error finding container 484035d9f38d2524d5fad49c5cade23d9441fb1c3b02a47888815e332ca88adc: Status 404 returned error can't find the container with id 484035d9f38d2524d5fad49c5cade23d9441fb1c3b02a47888815e332ca88adc
Apr 17 17:30:37.102836 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:37.102806 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7" event={"ID":"a059f062-fdc2-4442-8628-a0ce8440c047","Type":"ContainerStarted","Data":"484035d9f38d2524d5fad49c5cade23d9441fb1c3b02a47888815e332ca88adc"}
Apr 17 17:30:40.112796 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:40.112749 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7" event={"ID":"a059f062-fdc2-4442-8628-a0ce8440c047","Type":"ContainerStarted","Data":"d90c97b398393a57db54ce9dabc2dc845651617a9629dcbd61d01aaf0439f943"}
Apr 17 17:30:40.113317 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:40.112886 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:30:40.130557 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:40.130511 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7" podStartSLOduration=1.423987383 podStartE2EDuration="4.130496099s" podCreationTimestamp="2026-04-17 17:30:36 +0000 UTC" firstStartedPulling="2026-04-17 17:30:36.485224836 +0000 UTC m=+377.143269121" lastFinishedPulling="2026-04-17 17:30:39.191733551 +0000 UTC m=+379.849777837" observedRunningTime="2026-04-17 17:30:40.129583734 +0000 UTC m=+380.787628058" watchObservedRunningTime="2026-04-17 17:30:40.130496099 +0000 UTC m=+380.788540406"
Apr 17 17:30:46.118229 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:30:46.118201 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-597b96b99b-s9vz7"
Apr 17 17:31:21.415627 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.415591 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"]
Apr 17 17:31:21.417757 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.417741 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:21.421601 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.421578 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-svd7n\""
Apr 17 17:31:21.421699 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.421643 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\""
Apr 17 17:31:21.421699 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.421646 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\""
Apr 17 17:31:21.430186 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.430165 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"]
Apr 17 17:31:21.445512 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.445488 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bjc4m\" (UniqueName: \"kubernetes.io/projected/41f84757-fe14-47eb-8a98-2bd309842cf8-kube-api-access-bjc4m\") pod \"servicemesh-operator3-55f49c5f94-9d94w\" (UID: \"41f84757-fe14-47eb-8a98-2bd309842cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:21.445616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.445534 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/41f84757-fe14-47eb-8a98-2bd309842cf8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9d94w\" (UID: \"41f84757-fe14-47eb-8a98-2bd309842cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:21.546272 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.546211 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bjc4m\" (UniqueName: \"kubernetes.io/projected/41f84757-fe14-47eb-8a98-2bd309842cf8-kube-api-access-bjc4m\") pod \"servicemesh-operator3-55f49c5f94-9d94w\" (UID: \"41f84757-fe14-47eb-8a98-2bd309842cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:21.546442 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.546308 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/41f84757-fe14-47eb-8a98-2bd309842cf8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9d94w\" (UID: \"41f84757-fe14-47eb-8a98-2bd309842cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:21.548863 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.548839 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/41f84757-fe14-47eb-8a98-2bd309842cf8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-9d94w\" (UID: \"41f84757-fe14-47eb-8a98-2bd309842cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:21.555410 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.555381 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bjc4m\" (UniqueName: \"kubernetes.io/projected/41f84757-fe14-47eb-8a98-2bd309842cf8-kube-api-access-bjc4m\") pod \"servicemesh-operator3-55f49c5f94-9d94w\" (UID: \"41f84757-fe14-47eb-8a98-2bd309842cf8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:21.727043 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.726935 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:21.868439 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:21.868399 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"]
Apr 17 17:31:21.872292 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:31:21.872265 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41f84757_fe14_47eb_8a98_2bd309842cf8.slice/crio-dc5f2cbb82c0c7a9e2f5b497a58e9da2d71511b8650b16725e9ebba69389c662 WatchSource:0}: Error finding container dc5f2cbb82c0c7a9e2f5b497a58e9da2d71511b8650b16725e9ebba69389c662: Status 404 returned error can't find the container with id dc5f2cbb82c0c7a9e2f5b497a58e9da2d71511b8650b16725e9ebba69389c662
Apr 17 17:31:22.221737 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:22.221699 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w" event={"ID":"41f84757-fe14-47eb-8a98-2bd309842cf8","Type":"ContainerStarted","Data":"dc5f2cbb82c0c7a9e2f5b497a58e9da2d71511b8650b16725e9ebba69389c662"}
Apr 17 17:31:25.231663 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:25.231621 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w" event={"ID":"41f84757-fe14-47eb-8a98-2bd309842cf8","Type":"ContainerStarted","Data":"1903cf6ccf7daa5ccc56b94fcfc4f787aeee96647a0ec0a0ce2dd9d64a8e7ca8"}
Apr 17 17:31:25.232043 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:25.231746 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:25.254742 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:25.254684 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w" podStartSLOduration=1.270273959 podStartE2EDuration="4.254666364s" podCreationTimestamp="2026-04-17 17:31:21 +0000 UTC" firstStartedPulling="2026-04-17 17:31:21.875599204 +0000 UTC m=+422.533643493" lastFinishedPulling="2026-04-17 17:31:24.859991608 +0000 UTC m=+425.518035898" observedRunningTime="2026-04-17 17:31:25.253428071 +0000 UTC m=+425.911472381" watchObservedRunningTime="2026-04-17 17:31:25.254666364 +0000 UTC m=+425.912710673"
Apr 17 17:31:36.237899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.237865 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-9d94w"
Apr 17 17:31:36.615152 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.615078 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"]
Apr 17 17:31:36.617454 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.617438 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.620038 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.620017 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\""
Apr 17 17:31:36.620271 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.620233 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\""
Apr 17 17:31:36.620360 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.620304 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\""
Apr 17 17:31:36.620360 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.620320 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 17 17:31:36.620360 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.620328 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-6qxd9\""
Apr 17 17:31:36.620573 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.620562 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\""
Apr 17 17:31:36.620658 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.620644 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 17 17:31:36.629630 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.629611 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"]
Apr 17 17:31:36.765051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.765019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pg5m4\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-kube-api-access-pg5m4\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.765051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.765057 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.765310 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.765083 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.765310 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.765179 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c3906be6-cc16-4002-8351-69e1d6c3854e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.765310 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.765209 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.765310 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.765229 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.765483 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.765330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.866136 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.866050 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.866136 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.866104 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c3906be6-cc16-4002-8351-69e1d6c3854e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.866136 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.866136 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.866491 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.866159 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.866491 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.866194 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.866491 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.866314 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pg5m4\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-kube-api-access-pg5m4\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.866491 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.866360 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.866945 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.866920 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.868496 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.868475 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c3906be6-cc16-4002-8351-69e1d6c3854e-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.868988 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.868968 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.869046 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.869032 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.869082 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.869032 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.874079 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.874052 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.874440 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.874423 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pg5m4\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-kube-api-access-pg5m4\") pod \"istiod-openshift-gateway-7cd77c7ffd-l8ckn\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:36.926914 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:36.926876 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:37.062610 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:37.062575 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"]
Apr 17 17:31:37.065768 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:31:37.065738 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3906be6_cc16_4002_8351_69e1d6c3854e.slice/crio-9c90fe1b89d5e7db667d16decce126d0ec3a9eec8c444f59d6a1667868f1a3b0 WatchSource:0}: Error finding container 9c90fe1b89d5e7db667d16decce126d0ec3a9eec8c444f59d6a1667868f1a3b0: Status 404 returned error can't find the container with id 9c90fe1b89d5e7db667d16decce126d0ec3a9eec8c444f59d6a1667868f1a3b0
Apr 17 17:31:37.268866 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:37.268822 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn" event={"ID":"c3906be6-cc16-4002-8351-69e1d6c3854e","Type":"ContainerStarted","Data":"9c90fe1b89d5e7db667d16decce126d0ec3a9eec8c444f59d6a1667868f1a3b0"}
Apr 17 17:31:39.984583 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:39.984547 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 17:31:39.984972 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:39.984616 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 17:31:40.280212 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:40.280123 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn" event={"ID":"c3906be6-cc16-4002-8351-69e1d6c3854e","Type":"ContainerStarted","Data":"f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba"}
Apr 17 17:31:40.280212 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:40.280208 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:40.299414 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:40.299345 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn" podStartSLOduration=1.382799596 podStartE2EDuration="4.299329386s" podCreationTimestamp="2026-04-17 17:31:36 +0000 UTC" firstStartedPulling="2026-04-17 17:31:37.067817004 +0000 UTC m=+437.725861290" lastFinishedPulling="2026-04-17 17:31:39.984346791 +0000 UTC m=+440.642391080" observedRunningTime="2026-04-17 17:31:40.298517737 +0000 UTC m=+440.956562059" watchObservedRunningTime="2026-04-17 17:31:40.299329386 +0000 UTC m=+440.957373691"
Apr 17 17:31:41.285334 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:41.285299 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:31:43.242450 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.242409 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"]
Apr 17 17:31:43.250708 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.250671 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.252945 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.252920 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"openshift-ai-inference-openshift-default-dockercfg-xv5jq\""
Apr 17 17:31:43.258069 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.258035 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"]
Apr 17 17:31:43.320048 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320005 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kvq6k\" (UniqueName: \"kubernetes.io/projected/d47dd0f0-1374-487a-a2c1-0251ed74d094-kube-api-access-kvq6k\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.320205 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320063 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.320205 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320116 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.320205 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320141 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.320383 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320203 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.320383 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320269 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.320383 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320297 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.320383 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320331 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.320514 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.320372 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d47dd0f0-1374-487a-a2c1-0251ed74d094-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421499 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421454 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421679 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421511 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421679 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421564 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421679 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421592 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d47dd0f0-1374-487a-a2c1-0251ed74d094-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421679 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421640 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kvq6k\" (UniqueName: \"kubernetes.io/projected/d47dd0f0-1374-487a-a2c1-0251ed74d094-kube-api-access-kvq6k\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421904 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421690 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421904 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421729 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421904 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421761 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.421904 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421794 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.422095 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421911 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-workload-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"
Apr 17 17:31:43.422095 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.421948 2565 operation_generator.go:615]
"MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-workload-certs\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.422095 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.422037 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-credential-socket\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.422475 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.422453 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/d47dd0f0-1374-487a-a2c1-0251ed74d094-istiod-ca-cert\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.422638 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.422614 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-data\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.424630 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.424588 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-envoy\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.428749 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.428715 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-podinfo\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.430593 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.430574 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/d47dd0f0-1374-487a-a2c1-0251ed74d094-istio-token\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.430919 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.430894 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kvq6k\" (UniqueName: \"kubernetes.io/projected/d47dd0f0-1374-487a-a2c1-0251ed74d094-kube-api-access-kvq6k\") pod \"openshift-ai-inference-openshift-default-7c5447bb76-trgbv\" (UID: \"d47dd0f0-1374-487a-a2c1-0251ed74d094\") " pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.565965 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.565876 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:43.690813 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:43.690752 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv"] Apr 17 17:31:43.694797 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:31:43.694765 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd47dd0f0_1374_487a_a2c1_0251ed74d094.slice/crio-d6b6fb87fb4913828f593ead34a4d35cd61c3c4ec509da7d8fa99f3e8cb64e9e WatchSource:0}: Error finding container d6b6fb87fb4913828f593ead34a4d35cd61c3c4ec509da7d8fa99f3e8cb64e9e: Status 404 returned error can't find the container with id d6b6fb87fb4913828f593ead34a4d35cd61c3c4ec509da7d8fa99f3e8cb64e9e Apr 17 17:31:44.296214 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:44.296173 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" event={"ID":"d47dd0f0-1374-487a-a2c1-0251ed74d094","Type":"ContainerStarted","Data":"d6b6fb87fb4913828f593ead34a4d35cd61c3c4ec509da7d8fa99f3e8cb64e9e"} Apr 17 17:31:47.106612 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:47.106578 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:31:47.106876 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:47.106654 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:31:47.106876 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:47.106684 2565 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:31:47.308546 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:47.308458 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" event={"ID":"d47dd0f0-1374-487a-a2c1-0251ed74d094","Type":"ContainerStarted","Data":"dd58e5c4880ea9cc3dbb198eb25143f23db89a2c8942913af4e88d3fd96e6a7c"} Apr 17 17:31:47.329286 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:47.329212 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" podStartSLOduration=0.919531458 podStartE2EDuration="4.329198243s" podCreationTimestamp="2026-04-17 17:31:43 +0000 UTC" firstStartedPulling="2026-04-17 17:31:43.696660503 +0000 UTC m=+444.354704789" lastFinishedPulling="2026-04-17 17:31:47.106327284 +0000 UTC m=+447.764371574" observedRunningTime="2026-04-17 17:31:47.327567649 +0000 UTC m=+447.985611956" watchObservedRunningTime="2026-04-17 17:31:47.329198243 +0000 UTC m=+447.987242552" Apr 17 17:31:47.566292 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:47.566193 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:48.571171 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:48.571140 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:49.315185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:49.315153 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:49.316301 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:49.316285 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/openshift-ai-inference-openshift-default-7c5447bb76-trgbv" Apr 17 17:31:58.969866 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.969833 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-766d7bb8d4-ngwjt"] Apr 17 17:31:58.972179 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.972159 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:58.975349 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.975324 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 17 17:31:58.975349 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.975338 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 17 17:31:58.975981 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.975959 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 17 17:31:58.976092 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.976017 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 17 17:31:58.976092 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.976074 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 17 17:31:58.976208 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.976089 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 17 17:31:58.976277 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.976215 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-p4d47\"" Apr 17 
17:31:58.976344 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.976333 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 17 17:31:58.982252 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.982221 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 17 17:31:58.985299 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:58.985278 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-766d7bb8d4-ngwjt"] Apr 17 17:31:59.060302 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.060235 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-serving-cert\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.060302 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.060302 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-config\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.060536 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.060354 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xz5c8\" (UniqueName: \"kubernetes.io/projected/daaa0496-35db-43d8-80e2-7df51aa0cbe7-kube-api-access-xz5c8\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.060536 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.060413 2565 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-oauth-serving-cert\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.060536 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.060453 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-oauth-config\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.060536 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.060478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-service-ca\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.060536 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.060498 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-trusted-ca-bundle\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.161935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.161880 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-oauth-serving-cert\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " 
pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.161950 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-oauth-config\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.161994 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-service-ca\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.162010 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-trusted-ca-bundle\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.162036 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-serving-cert\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.162057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-config\") pod 
\"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162431 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.162091 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xz5c8\" (UniqueName: \"kubernetes.io/projected/daaa0496-35db-43d8-80e2-7df51aa0cbe7-kube-api-access-xz5c8\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162741 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.162719 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-oauth-serving-cert\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162841 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.162818 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-service-ca\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162893 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.162871 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-config\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.162942 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.162925 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/daaa0496-35db-43d8-80e2-7df51aa0cbe7-trusted-ca-bundle\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.164516 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.164494 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-oauth-config\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.164750 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.164734 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/daaa0496-35db-43d8-80e2-7df51aa0cbe7-console-serving-cert\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.170769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.170745 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xz5c8\" (UniqueName: \"kubernetes.io/projected/daaa0496-35db-43d8-80e2-7df51aa0cbe7-kube-api-access-xz5c8\") pod \"console-766d7bb8d4-ngwjt\" (UID: \"daaa0496-35db-43d8-80e2-7df51aa0cbe7\") " pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.281746 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.281651 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-766d7bb8d4-ngwjt" Apr 17 17:31:59.401072 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:31:59.401011 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-766d7bb8d4-ngwjt"] Apr 17 17:31:59.403596 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:31:59.403570 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddaaa0496_35db_43d8_80e2_7df51aa0cbe7.slice/crio-17b03afeb5111d1ff30f23951d2a5332bed6d5999db0cd0b5a141f63195cd4af WatchSource:0}: Error finding container 17b03afeb5111d1ff30f23951d2a5332bed6d5999db0cd0b5a141f63195cd4af: Status 404 returned error can't find the container with id 17b03afeb5111d1ff30f23951d2a5332bed6d5999db0cd0b5a141f63195cd4af Apr 17 17:32:00.349071 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:00.349033 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-766d7bb8d4-ngwjt" event={"ID":"daaa0496-35db-43d8-80e2-7df51aa0cbe7","Type":"ContainerStarted","Data":"ea205f2c21c2bde950e2cd3b262a657b2a61d9176a4b964ebeda8290deebe0f8"} Apr 17 17:32:00.349071 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:00.349067 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-766d7bb8d4-ngwjt" event={"ID":"daaa0496-35db-43d8-80e2-7df51aa0cbe7","Type":"ContainerStarted","Data":"17b03afeb5111d1ff30f23951d2a5332bed6d5999db0cd0b5a141f63195cd4af"} Apr 17 17:32:00.373041 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:00.372996 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-766d7bb8d4-ngwjt" podStartSLOduration=2.3729812519999998 podStartE2EDuration="2.372981252s" podCreationTimestamp="2026-04-17 17:31:58 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:32:00.371536597 +0000 UTC 
m=+461.029580918" watchObservedRunningTime="2026-04-17 17:32:00.372981252 +0000 UTC m=+461.031025559" Apr 17 17:32:03.415109 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.415062 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-b7fn5"] Apr 17 17:32:03.417339 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.417323 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5" Apr 17 17:32:03.419721 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.419695 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 17 17:32:03.419848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.419766 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-hj277\"" Apr 17 17:32:03.420210 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.420189 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 17 17:32:03.428791 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.428762 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-b7fn5"] Apr 17 17:32:03.494513 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.494476 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8mr2r\" (UniqueName: \"kubernetes.io/projected/7a57d275-8f91-411b-b982-840a5f8635cf-kube-api-access-8mr2r\") pod \"authorino-operator-7587b89b76-b7fn5\" (UID: \"7a57d275-8f91-411b-b982-840a5f8635cf\") " pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5" Apr 17 17:32:03.595104 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.595064 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8mr2r\" (UniqueName: 
\"kubernetes.io/projected/7a57d275-8f91-411b-b982-840a5f8635cf-kube-api-access-8mr2r\") pod \"authorino-operator-7587b89b76-b7fn5\" (UID: \"7a57d275-8f91-411b-b982-840a5f8635cf\") " pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5" Apr 17 17:32:03.612393 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.612361 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8mr2r\" (UniqueName: \"kubernetes.io/projected/7a57d275-8f91-411b-b982-840a5f8635cf-kube-api-access-8mr2r\") pod \"authorino-operator-7587b89b76-b7fn5\" (UID: \"7a57d275-8f91-411b-b982-840a5f8635cf\") " pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5" Apr 17 17:32:03.728760 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.728661 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5" Apr 17 17:32:03.869747 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:03.869671 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-b7fn5"] Apr 17 17:32:03.872580 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:32:03.872546 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7a57d275_8f91_411b_b982_840a5f8635cf.slice/crio-7f363ac22c638cfbfaa37d3706170a09278fe78f623ae119b9a4fa071a135dbf WatchSource:0}: Error finding container 7f363ac22c638cfbfaa37d3706170a09278fe78f623ae119b9a4fa071a135dbf: Status 404 returned error can't find the container with id 7f363ac22c638cfbfaa37d3706170a09278fe78f623ae119b9a4fa071a135dbf Apr 17 17:32:04.363400 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:04.363362 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5" event={"ID":"7a57d275-8f91-411b-b982-840a5f8635cf","Type":"ContainerStarted","Data":"7f363ac22c638cfbfaa37d3706170a09278fe78f623ae119b9a4fa071a135dbf"} Apr 17 
17:32:06.371662 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:06.371623 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5" event={"ID":"7a57d275-8f91-411b-b982-840a5f8635cf","Type":"ContainerStarted","Data":"3757558d7ebd4b5849f0e53aec833518b64bf8f13876a0f74257bf7e3e810f7b"}
Apr 17 17:32:06.372076 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:06.371787 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5"
Apr 17 17:32:06.394012 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:06.393956 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5" podStartSLOduration=1.002366709 podStartE2EDuration="3.393937764s" podCreationTimestamp="2026-04-17 17:32:03 +0000 UTC" firstStartedPulling="2026-04-17 17:32:03.874719476 +0000 UTC m=+464.532763763" lastFinishedPulling="2026-04-17 17:32:06.266290529 +0000 UTC m=+466.924334818" observedRunningTime="2026-04-17 17:32:06.393505213 +0000 UTC m=+467.051549521" watchObservedRunningTime="2026-04-17 17:32:06.393937764 +0000 UTC m=+467.051982075"
Apr 17 17:32:09.006357 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.006320 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"]
Apr 17 17:32:09.009032 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.009011 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"
Apr 17 17:32:09.011655 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.011631 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-kjmmq\""
Apr 17 17:32:09.027126 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.027099 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"]
Apr 17 17:32:09.039477 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.039424 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kpq2m\" (UniqueName: \"kubernetes.io/projected/5369b1e8-120b-4804-977d-218d836aebdb-kube-api-access-kpq2m\") pod \"limitador-operator-controller-manager-c7fb4c8d5-8gp5z\" (UID: \"5369b1e8-120b-4804-977d-218d836aebdb\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"
Apr 17 17:32:09.140160 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.140120 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kpq2m\" (UniqueName: \"kubernetes.io/projected/5369b1e8-120b-4804-977d-218d836aebdb-kube-api-access-kpq2m\") pod \"limitador-operator-controller-manager-c7fb4c8d5-8gp5z\" (UID: \"5369b1e8-120b-4804-977d-218d836aebdb\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"
Apr 17 17:32:09.150705 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.150675 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kpq2m\" (UniqueName: \"kubernetes.io/projected/5369b1e8-120b-4804-977d-218d836aebdb-kube-api-access-kpq2m\") pod \"limitador-operator-controller-manager-c7fb4c8d5-8gp5z\" (UID: \"5369b1e8-120b-4804-977d-218d836aebdb\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"
Apr 17 17:32:09.282026 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.281922 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-766d7bb8d4-ngwjt"
Apr 17 17:32:09.282026 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.281974 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-766d7bb8d4-ngwjt"
Apr 17 17:32:09.286769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.286739 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-766d7bb8d4-ngwjt"
Apr 17 17:32:09.319873 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.319843 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"
Apr 17 17:32:09.395343 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.395314 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-766d7bb8d4-ngwjt"
Apr 17 17:32:09.466995 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:09.466952 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"]
Apr 17 17:32:09.470655 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:32:09.470630 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5369b1e8_120b_4804_977d_218d836aebdb.slice/crio-2142216c1d825552a910eb240f5226365352552c4239da2fa6365b4541f9ae07 WatchSource:0}: Error finding container 2142216c1d825552a910eb240f5226365352552c4239da2fa6365b4541f9ae07: Status 404 returned error can't find the container with id 2142216c1d825552a910eb240f5226365352552c4239da2fa6365b4541f9ae07
Apr 17 17:32:10.396320 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:10.396279 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z" event={"ID":"5369b1e8-120b-4804-977d-218d836aebdb","Type":"ContainerStarted","Data":"2142216c1d825552a910eb240f5226365352552c4239da2fa6365b4541f9ae07"}
Apr 17 17:32:11.401101 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:11.401069 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z" event={"ID":"5369b1e8-120b-4804-977d-218d836aebdb","Type":"ContainerStarted","Data":"403d80d45e505bff09c91322b5133a2bb697f670c954d462bfd90467c4c67c69"}
Apr 17 17:32:11.401560 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:11.401182 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"
Apr 17 17:32:11.421380 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:11.421339 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z" podStartSLOduration=1.901454904 podStartE2EDuration="3.421325135s" podCreationTimestamp="2026-04-17 17:32:08 +0000 UTC" firstStartedPulling="2026-04-17 17:32:09.4731553 +0000 UTC m=+470.131199586" lastFinishedPulling="2026-04-17 17:32:10.993025529 +0000 UTC m=+471.651069817" observedRunningTime="2026-04-17 17:32:11.420425953 +0000 UTC m=+472.078470268" watchObservedRunningTime="2026-04-17 17:32:11.421325135 +0000 UTC m=+472.079369442"
Apr 17 17:32:17.377517 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:17.377485 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-b7fn5"
Apr 17 17:32:22.407495 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:32:22.407462 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-8gp5z"
Apr 17 17:33:02.036216 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.036177 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"]
Apr 17 17:33:02.039590 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.039574 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:02.041789 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.041769 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 17 17:33:02.041888 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.041821 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-bblll\""
Apr 17 17:33:02.049166 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.049133 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"]
Apr 17 17:33:02.077037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.076996 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d5397555-1a13-4039-a92c-e1cb41fa8046-config-file\") pod \"limitador-limitador-64c8f475fb-bd2bb\" (UID: \"d5397555-1a13-4039-a92c-e1cb41fa8046\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:02.077186 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.077149 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gnqqk\" (UniqueName: \"kubernetes.io/projected/d5397555-1a13-4039-a92c-e1cb41fa8046-kube-api-access-gnqqk\") pod \"limitador-limitador-64c8f475fb-bd2bb\" (UID: \"d5397555-1a13-4039-a92c-e1cb41fa8046\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:02.129933 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.129898 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"]
Apr 17 17:33:02.178122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.178092 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gnqqk\" (UniqueName: \"kubernetes.io/projected/d5397555-1a13-4039-a92c-e1cb41fa8046-kube-api-access-gnqqk\") pod \"limitador-limitador-64c8f475fb-bd2bb\" (UID: \"d5397555-1a13-4039-a92c-e1cb41fa8046\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:02.178307 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.178153 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d5397555-1a13-4039-a92c-e1cb41fa8046-config-file\") pod \"limitador-limitador-64c8f475fb-bd2bb\" (UID: \"d5397555-1a13-4039-a92c-e1cb41fa8046\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:02.178726 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.178706 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d5397555-1a13-4039-a92c-e1cb41fa8046-config-file\") pod \"limitador-limitador-64c8f475fb-bd2bb\" (UID: \"d5397555-1a13-4039-a92c-e1cb41fa8046\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:02.186213 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.186184 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gnqqk\" (UniqueName: \"kubernetes.io/projected/d5397555-1a13-4039-a92c-e1cb41fa8046-kube-api-access-gnqqk\") pod \"limitador-limitador-64c8f475fb-bd2bb\" (UID: \"d5397555-1a13-4039-a92c-e1cb41fa8046\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:02.350175 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.350080 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:02.484001 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.483964 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"]
Apr 17 17:33:02.487893 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:33:02.487862 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd5397555_1a13_4039_a92c_e1cb41fa8046.slice/crio-e39db4d4a8c0237ead1be5f8285fcf8fbf9b1098bca32579dc6ab012c7e2fcd3 WatchSource:0}: Error finding container e39db4d4a8c0237ead1be5f8285fcf8fbf9b1098bca32579dc6ab012c7e2fcd3: Status 404 returned error can't find the container with id e39db4d4a8c0237ead1be5f8285fcf8fbf9b1098bca32579dc6ab012c7e2fcd3
Apr 17 17:33:02.568424 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:02.568386 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb" event={"ID":"d5397555-1a13-4039-a92c-e1cb41fa8046","Type":"ContainerStarted","Data":"e39db4d4a8c0237ead1be5f8285fcf8fbf9b1098bca32579dc6ab012c7e2fcd3"}
Apr 17 17:33:07.587703 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:07.587663 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb" event={"ID":"d5397555-1a13-4039-a92c-e1cb41fa8046","Type":"ContainerStarted","Data":"b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405"}
Apr 17 17:33:07.588109 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:07.587829 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:18.592232 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:18.592202 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:18.609340 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:18.609274 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb" podStartSLOduration=12.556719997 podStartE2EDuration="16.609254487s" podCreationTimestamp="2026-04-17 17:33:02 +0000 UTC" firstStartedPulling="2026-04-17 17:33:02.49014827 +0000 UTC m=+523.148192559" lastFinishedPulling="2026-04-17 17:33:06.54268276 +0000 UTC m=+527.200727049" observedRunningTime="2026-04-17 17:33:07.606343735 +0000 UTC m=+528.264388044" watchObservedRunningTime="2026-04-17 17:33:18.609254487 +0000 UTC m=+539.267298788"
Apr 17 17:33:20.866738 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:20.866705 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"]
Apr 17 17:33:20.867135 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:20.866956 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb" podUID="d5397555-1a13-4039-a92c-e1cb41fa8046" containerName="limitador" containerID="cri-o://b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405" gracePeriod=30
Apr 17 17:33:21.409323 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.409297 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:21.537143 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.537046 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d5397555-1a13-4039-a92c-e1cb41fa8046-config-file\") pod \"d5397555-1a13-4039-a92c-e1cb41fa8046\" (UID: \"d5397555-1a13-4039-a92c-e1cb41fa8046\") "
Apr 17 17:33:21.537143 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.537101 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gnqqk\" (UniqueName: \"kubernetes.io/projected/d5397555-1a13-4039-a92c-e1cb41fa8046-kube-api-access-gnqqk\") pod \"d5397555-1a13-4039-a92c-e1cb41fa8046\" (UID: \"d5397555-1a13-4039-a92c-e1cb41fa8046\") "
Apr 17 17:33:21.537478 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.537454 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/d5397555-1a13-4039-a92c-e1cb41fa8046-config-file" (OuterVolumeSpecName: "config-file") pod "d5397555-1a13-4039-a92c-e1cb41fa8046" (UID: "d5397555-1a13-4039-a92c-e1cb41fa8046"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:33:21.539274 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.539236 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d5397555-1a13-4039-a92c-e1cb41fa8046-kube-api-access-gnqqk" (OuterVolumeSpecName: "kube-api-access-gnqqk") pod "d5397555-1a13-4039-a92c-e1cb41fa8046" (UID: "d5397555-1a13-4039-a92c-e1cb41fa8046"). InnerVolumeSpecName "kube-api-access-gnqqk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:33:21.635345 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.635309 2565 generic.go:358] "Generic (PLEG): container finished" podID="d5397555-1a13-4039-a92c-e1cb41fa8046" containerID="b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405" exitCode=0
Apr 17 17:33:21.635485 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.635380 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"
Apr 17 17:33:21.635485 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.635401 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb" event={"ID":"d5397555-1a13-4039-a92c-e1cb41fa8046","Type":"ContainerDied","Data":"b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405"}
Apr 17 17:33:21.635485 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.635440 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-bd2bb" event={"ID":"d5397555-1a13-4039-a92c-e1cb41fa8046","Type":"ContainerDied","Data":"e39db4d4a8c0237ead1be5f8285fcf8fbf9b1098bca32579dc6ab012c7e2fcd3"}
Apr 17 17:33:21.635485 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.635460 2565 scope.go:117] "RemoveContainer" containerID="b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405"
Apr 17 17:33:21.637742 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.637700 2565 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/d5397555-1a13-4039-a92c-e1cb41fa8046-config-file\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:33:21.637742 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.637729 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gnqqk\" (UniqueName: \"kubernetes.io/projected/d5397555-1a13-4039-a92c-e1cb41fa8046-kube-api-access-gnqqk\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:33:21.643425 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.643411 2565 scope.go:117] "RemoveContainer" containerID="b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405"
Apr 17 17:33:21.643667 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:33:21.643648 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405\": container with ID starting with b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405 not found: ID does not exist" containerID="b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405"
Apr 17 17:33:21.643709 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.643674 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405"} err="failed to get container status \"b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405\": rpc error: code = NotFound desc = could not find container \"b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405\": container with ID starting with b85283e03a3ede15d5183f7e5a55e044bb62c93644f563ff1e31bd739e7b7405 not found: ID does not exist"
Apr 17 17:33:21.661614 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.661588 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"]
Apr 17 17:33:21.665810 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.665786 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-bd2bb"]
Apr 17 17:33:21.952466 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:21.952435 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d5397555-1a13-4039-a92c-e1cb41fa8046" path="/var/lib/kubelet/pods/d5397555-1a13-4039-a92c-e1cb41fa8046/volumes"
Apr 17 17:33:40.285507 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.285462 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"]
Apr 17 17:33:40.285971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.285774 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d5397555-1a13-4039-a92c-e1cb41fa8046" containerName="limitador"
Apr 17 17:33:40.285971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.285785 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="d5397555-1a13-4039-a92c-e1cb41fa8046" containerName="limitador"
Apr 17 17:33:40.285971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.285840 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="d5397555-1a13-4039-a92c-e1cb41fa8046" containerName="limitador"
Apr 17 17:33:40.287816 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.287793 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.303963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.303935 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"]
Apr 17 17:33:40.402082 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.402038 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.402238 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.402100 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.402238 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.402140 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.402238 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.402172 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.402363 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.402232 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.402363 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.402330 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5abf0c93-54f4-4998-afdd-68635ef2572c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.402363 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.402356 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4s8pc\" (UniqueName: \"kubernetes.io/projected/5abf0c93-54f4-4998-afdd-68635ef2572c-kube-api-access-4s8pc\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.503115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.503082 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.503325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.503131 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5abf0c93-54f4-4998-afdd-68635ef2572c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.503325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.503160 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4s8pc\" (UniqueName: \"kubernetes.io/projected/5abf0c93-54f4-4998-afdd-68635ef2572c-kube-api-access-4s8pc\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.503325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.503192 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.503325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.503263 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.503325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.503304 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.503602 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.503341 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.504073 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.504046 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.505897 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.505868 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.506121 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.506102 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/5abf0c93-54f4-4998-afdd-68635ef2572c-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.506218 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.506192 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.506295 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.506280 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.517725 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.517699 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/5abf0c93-54f4-4998-afdd-68635ef2572c-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.517862 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.517787 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4s8pc\" (UniqueName: \"kubernetes.io/projected/5abf0c93-54f4-4998-afdd-68635ef2572c-kube-api-access-4s8pc\") pod \"istiod-openshift-gateway-55ff986f96-zwmhm\" (UID: \"5abf0c93-54f4-4998-afdd-68635ef2572c\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.597684 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.597590 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:40.739715 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.739672 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"]
Apr 17 17:33:40.745409 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.745367 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 17:33:40.745509 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:40.745489 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"}
Apr 17 17:33:41.704382 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:41.704350 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm" event={"ID":"5abf0c93-54f4-4998-afdd-68635ef2572c","Type":"ContainerStarted","Data":"309a2f3463997eca18cb48fde4f5bd4d39caff72a303d996c23766ae0c1a946a"}
Apr 17 17:33:41.704382 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:41.704387 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm" event={"ID":"5abf0c93-54f4-4998-afdd-68635ef2572c","Type":"ContainerStarted","Data":"e93d9582a947badb29c4eaa9113b80a2c05f703de54e151f9489fb3cfbdd78e1"}
Apr 17 17:33:41.704856 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:41.704568 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:41.706081 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:41.706053 2565 patch_prober.go:28] interesting pod/istiod-openshift-gateway-55ff986f96-zwmhm container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body=
Apr 17 17:33:41.706189 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:41.706105 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm" podUID="5abf0c93-54f4-4998-afdd-68635ef2572c" containerName="discovery" probeResult="failure" output="HTTP probe failed with statuscode: 503"
Apr 17 17:33:41.730805 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:41.730749 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm" podStartSLOduration=1.730731008 podStartE2EDuration="1.730731008s" podCreationTimestamp="2026-04-17 17:33:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:33:41.728605113 +0000 UTC m=+562.386649423" watchObservedRunningTime="2026-04-17 17:33:41.730731008 +0000 UTC m=+562.388775317"
Apr 17 17:33:42.707947 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:42.707923 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-zwmhm"
Apr 17 17:33:42.793404 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:42.793366 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"]
Apr 17 17:33:42.793714 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:42.793667 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn" podUID="c3906be6-cc16-4002-8351-69e1d6c3854e" containerName="discovery" containerID="cri-o://f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba" gracePeriod=30
Apr 17 17:33:43.037969 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.037946 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"
Apr 17 17:33:43.128159 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.128129 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-token\") pod \"c3906be6-cc16-4002-8351-69e1d6c3854e\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") "
Apr 17 17:33:43.128355 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.128177 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-cacerts\") pod \"c3906be6-cc16-4002-8351-69e1d6c3854e\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") "
Apr 17 17:33:43.128355 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.128202 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pg5m4\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-kube-api-access-pg5m4\") pod \"c3906be6-cc16-4002-8351-69e1d6c3854e\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") "
Apr 17 17:33:43.128355 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.128219 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-dns-cert\") pod \"c3906be6-cc16-4002-8351-69e1d6c3854e\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") "
Apr 17 17:33:43.128355 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.128283 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c3906be6-cc16-4002-8351-69e1d6c3854e-local-certs\") pod \"c3906be6-cc16-4002-8351-69e1d6c3854e\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") "
Apr 17 17:33:43.128355 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.128297 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-kubeconfig\") pod \"c3906be6-cc16-4002-8351-69e1d6c3854e\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") "
Apr 17 17:33:43.128355 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.128355 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-ca-configmap\") pod \"c3906be6-cc16-4002-8351-69e1d6c3854e\" (UID: \"c3906be6-cc16-4002-8351-69e1d6c3854e\") "
Apr 17 17:33:43.128845 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.128813 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "c3906be6-cc16-4002-8351-69e1d6c3854e" (UID: "c3906be6-cc16-4002-8351-69e1d6c3854e"). InnerVolumeSpecName "istio-csr-ca-configmap". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 17 17:33:43.130762 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.130729 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-token" (OuterVolumeSpecName: "istio-token") pod "c3906be6-cc16-4002-8351-69e1d6c3854e" (UID: "c3906be6-cc16-4002-8351-69e1d6c3854e"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:33:43.130881 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.130861 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-kube-api-access-pg5m4" (OuterVolumeSpecName: "kube-api-access-pg5m4") pod "c3906be6-cc16-4002-8351-69e1d6c3854e" (UID: "c3906be6-cc16-4002-8351-69e1d6c3854e"). InnerVolumeSpecName "kube-api-access-pg5m4". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:33:43.130930 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.130870 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "c3906be6-cc16-4002-8351-69e1d6c3854e" (UID: "c3906be6-cc16-4002-8351-69e1d6c3854e"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:33:43.130930 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.130871 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-cacerts" (OuterVolumeSpecName: "cacerts") pod "c3906be6-cc16-4002-8351-69e1d6c3854e" (UID: "c3906be6-cc16-4002-8351-69e1d6c3854e"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:33:43.130930 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.130908 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "c3906be6-cc16-4002-8351-69e1d6c3854e" (UID: "c3906be6-cc16-4002-8351-69e1d6c3854e"). InnerVolumeSpecName "istio-kubeconfig".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:33:43.131021 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.130915 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c3906be6-cc16-4002-8351-69e1d6c3854e-local-certs" (OuterVolumeSpecName: "local-certs") pod "c3906be6-cc16-4002-8351-69e1d6c3854e" (UID: "c3906be6-cc16-4002-8351-69e1d6c3854e"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:33:43.229676 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.229586 2565 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/c3906be6-cc16-4002-8351-69e1d6c3854e-local-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:33:43.229676 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.229617 2565 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-kubeconfig\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:33:43.229676 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.229627 2565 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-ca-configmap\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:33:43.229676 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.229636 2565 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-token\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:33:43.229676 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.229647 2565 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-cacerts\") on node 
\"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:33:43.229676 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.229655 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pg5m4\" (UniqueName: \"kubernetes.io/projected/c3906be6-cc16-4002-8351-69e1d6c3854e-kube-api-access-pg5m4\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:33:43.229676 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.229663 2565 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/c3906be6-cc16-4002-8351-69e1d6c3854e-istio-csr-dns-cert\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:33:43.711462 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.711421 2565 generic.go:358] "Generic (PLEG): container finished" podID="c3906be6-cc16-4002-8351-69e1d6c3854e" containerID="f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba" exitCode=0 Apr 17 17:33:43.711895 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.711481 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn" Apr 17 17:33:43.711895 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.711507 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn" event={"ID":"c3906be6-cc16-4002-8351-69e1d6c3854e","Type":"ContainerDied","Data":"f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba"} Apr 17 17:33:43.711895 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.711539 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn" event={"ID":"c3906be6-cc16-4002-8351-69e1d6c3854e","Type":"ContainerDied","Data":"9c90fe1b89d5e7db667d16decce126d0ec3a9eec8c444f59d6a1667868f1a3b0"} Apr 17 17:33:43.711895 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.711559 2565 scope.go:117] "RemoveContainer" containerID="f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba" Apr 17 17:33:43.720411 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.720390 2565 scope.go:117] "RemoveContainer" containerID="f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba" Apr 17 17:33:43.720681 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:33:43.720656 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba\": container with ID starting with f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba not found: ID does not exist" containerID="f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba" Apr 17 17:33:43.720750 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.720695 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba"} err="failed to get container status 
\"f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba\": rpc error: code = NotFound desc = could not find container \"f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba\": container with ID starting with f1cf3246e5c43504f5a32bc82367ffba782a3e21ff39360bf9bfa432403a85ba not found: ID does not exist" Apr 17 17:33:43.760565 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.760533 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"] Apr 17 17:33:43.766995 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.766967 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-l8ckn"] Apr 17 17:33:43.952350 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:43.952321 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c3906be6-cc16-4002-8351-69e1d6c3854e" path="/var/lib/kubelet/pods/c3906be6-cc16-4002-8351-69e1d6c3854e/volumes" Apr 17 17:33:49.547883 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.547840 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mc6bv"] Apr 17 17:33:49.548474 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.548305 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c3906be6-cc16-4002-8351-69e1d6c3854e" containerName="discovery" Apr 17 17:33:49.548474 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.548324 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c3906be6-cc16-4002-8351-69e1d6c3854e" containerName="discovery" Apr 17 17:33:49.548474 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.548406 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c3906be6-cc16-4002-8351-69e1d6c3854e" containerName="discovery" Apr 17 17:33:49.552751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.552728 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:49.556045 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.556012 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 17 17:33:49.556204 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.556061 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 17 17:33:49.556403 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.556357 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 17 17:33:49.557197 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.557177 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-xphw7\"" Apr 17 17:33:49.565301 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.565272 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mc6bv"] Apr 17 17:33:49.568359 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.568337 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv"] Apr 17 17:33:49.571580 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.571558 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:49.573908 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.573887 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-drg5n\"" Apr 17 17:33:49.574131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.574114 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 17 17:33:49.585715 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.585692 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv"] Apr 17 17:33:49.679205 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.679156 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lzhrq\" (UniqueName: \"kubernetes.io/projected/697207a3-3f07-49b4-ac35-aa189e7e1614-kube-api-access-lzhrq\") pod \"kserve-controller-manager-85dd7cfb4d-mc6bv\" (UID: \"697207a3-3f07-49b4-ac35-aa189e7e1614\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:49.679205 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.679199 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c81ffb9f-ffd0-4087-822a-614a206ea1fb-cert\") pod \"llmisvc-controller-manager-9d4bc98dc-8kmrv\" (UID: \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\") " pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:49.679461 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.679298 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/697207a3-3f07-49b4-ac35-aa189e7e1614-cert\") pod \"kserve-controller-manager-85dd7cfb4d-mc6bv\" (UID: \"697207a3-3f07-49b4-ac35-aa189e7e1614\") " 
pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:49.679461 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.679353 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xb5ss\" (UniqueName: \"kubernetes.io/projected/c81ffb9f-ffd0-4087-822a-614a206ea1fb-kube-api-access-xb5ss\") pod \"llmisvc-controller-manager-9d4bc98dc-8kmrv\" (UID: \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\") " pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:49.780776 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.780735 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lzhrq\" (UniqueName: \"kubernetes.io/projected/697207a3-3f07-49b4-ac35-aa189e7e1614-kube-api-access-lzhrq\") pod \"kserve-controller-manager-85dd7cfb4d-mc6bv\" (UID: \"697207a3-3f07-49b4-ac35-aa189e7e1614\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:49.780776 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.780789 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c81ffb9f-ffd0-4087-822a-614a206ea1fb-cert\") pod \"llmisvc-controller-manager-9d4bc98dc-8kmrv\" (UID: \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\") " pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:49.780989 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.780819 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/697207a3-3f07-49b4-ac35-aa189e7e1614-cert\") pod \"kserve-controller-manager-85dd7cfb4d-mc6bv\" (UID: \"697207a3-3f07-49b4-ac35-aa189e7e1614\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:49.780989 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.780868 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xb5ss\" (UniqueName: 
\"kubernetes.io/projected/c81ffb9f-ffd0-4087-822a-614a206ea1fb-kube-api-access-xb5ss\") pod \"llmisvc-controller-manager-9d4bc98dc-8kmrv\" (UID: \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\") " pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:49.783542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.783516 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/697207a3-3f07-49b4-ac35-aa189e7e1614-cert\") pod \"kserve-controller-manager-85dd7cfb4d-mc6bv\" (UID: \"697207a3-3f07-49b4-ac35-aa189e7e1614\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:49.783838 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.783821 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c81ffb9f-ffd0-4087-822a-614a206ea1fb-cert\") pod \"llmisvc-controller-manager-9d4bc98dc-8kmrv\" (UID: \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\") " pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:49.789151 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.789127 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xb5ss\" (UniqueName: \"kubernetes.io/projected/c81ffb9f-ffd0-4087-822a-614a206ea1fb-kube-api-access-xb5ss\") pod \"llmisvc-controller-manager-9d4bc98dc-8kmrv\" (UID: \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\") " pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:49.789276 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.789208 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lzhrq\" (UniqueName: \"kubernetes.io/projected/697207a3-3f07-49b4-ac35-aa189e7e1614-kube-api-access-lzhrq\") pod \"kserve-controller-manager-85dd7cfb4d-mc6bv\" (UID: \"697207a3-3f07-49b4-ac35-aa189e7e1614\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:49.864038 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:33:49.863952 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:49.884918 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:49.884889 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:50.004567 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:50.004518 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mc6bv"] Apr 17 17:33:50.008074 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:33:50.008035 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod697207a3_3f07_49b4_ac35_aa189e7e1614.slice/crio-4166505321b3b4d74543206f9211a1d5b0ff60ee8a418d772fafe32bec972930 WatchSource:0}: Error finding container 4166505321b3b4d74543206f9211a1d5b0ff60ee8a418d772fafe32bec972930: Status 404 returned error can't find the container with id 4166505321b3b4d74543206f9211a1d5b0ff60ee8a418d772fafe32bec972930 Apr 17 17:33:50.031452 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:50.031423 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv"] Apr 17 17:33:50.034760 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:33:50.034734 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc81ffb9f_ffd0_4087_822a_614a206ea1fb.slice/crio-c11a58f4ee77ce1e642acffe018f6224ed2d88d09a07a109719daebc47314f2b WatchSource:0}: Error finding container c11a58f4ee77ce1e642acffe018f6224ed2d88d09a07a109719daebc47314f2b: Status 404 returned error can't find the container with id c11a58f4ee77ce1e642acffe018f6224ed2d88d09a07a109719daebc47314f2b Apr 17 17:33:50.741335 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:50.741284 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" event={"ID":"697207a3-3f07-49b4-ac35-aa189e7e1614","Type":"ContainerStarted","Data":"4166505321b3b4d74543206f9211a1d5b0ff60ee8a418d772fafe32bec972930"} Apr 17 17:33:50.742506 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:50.742483 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" event={"ID":"c81ffb9f-ffd0-4087-822a-614a206ea1fb","Type":"ContainerStarted","Data":"c11a58f4ee77ce1e642acffe018f6224ed2d88d09a07a109719daebc47314f2b"} Apr 17 17:33:53.754674 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:53.754579 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" event={"ID":"c81ffb9f-ffd0-4087-822a-614a206ea1fb","Type":"ContainerStarted","Data":"62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9"} Apr 17 17:33:53.754674 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:53.754647 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:33:53.755852 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:53.755825 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" event={"ID":"697207a3-3f07-49b4-ac35-aa189e7e1614","Type":"ContainerStarted","Data":"c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287"} Apr 17 17:33:53.755971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:53.755960 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:33:53.771494 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:53.771452 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" podStartSLOduration=1.287933365 podStartE2EDuration="4.771439331s" podCreationTimestamp="2026-04-17 17:33:49 +0000 UTC" 
firstStartedPulling="2026-04-17 17:33:50.036062614 +0000 UTC m=+570.694106903" lastFinishedPulling="2026-04-17 17:33:53.519568568 +0000 UTC m=+574.177612869" observedRunningTime="2026-04-17 17:33:53.769871151 +0000 UTC m=+574.427915457" watchObservedRunningTime="2026-04-17 17:33:53.771439331 +0000 UTC m=+574.429483639" Apr 17 17:33:53.785164 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:33:53.785123 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" podStartSLOduration=1.930364653 podStartE2EDuration="4.785110006s" podCreationTimestamp="2026-04-17 17:33:49 +0000 UTC" firstStartedPulling="2026-04-17 17:33:50.009425877 +0000 UTC m=+570.667470163" lastFinishedPulling="2026-04-17 17:33:52.864171213 +0000 UTC m=+573.522215516" observedRunningTime="2026-04-17 17:33:53.783793395 +0000 UTC m=+574.441837715" watchObservedRunningTime="2026-04-17 17:33:53.785110006 +0000 UTC m=+574.443154369" Apr 17 17:34:24.762158 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:24.762121 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" Apr 17 17:34:24.765349 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:24.765329 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:34:25.994348 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:25.994312 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mc6bv"] Apr 17 17:34:25.994820 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:25.994524 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" podUID="697207a3-3f07-49b4-ac35-aa189e7e1614" containerName="manager" containerID="cri-o://c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287" gracePeriod=10 Apr 17 17:34:26.019190 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.019161 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rn92v"] Apr 17 17:34:26.074520 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.074494 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rn92v"] Apr 17 17:34:26.074621 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.074611 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:26.178347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.178310 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33b90b00-ed45-4e02-bc2c-6f2abe04274f-cert\") pod \"kserve-controller-manager-85dd7cfb4d-rn92v\" (UID: \"33b90b00-ed45-4e02-bc2c-6f2abe04274f\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:26.178499 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.178367 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cvf7r\" (UniqueName: \"kubernetes.io/projected/33b90b00-ed45-4e02-bc2c-6f2abe04274f-kube-api-access-cvf7r\") pod \"kserve-controller-manager-85dd7cfb4d-rn92v\" (UID: \"33b90b00-ed45-4e02-bc2c-6f2abe04274f\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:26.256045 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.256021 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:34:26.278873 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.278842 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33b90b00-ed45-4e02-bc2c-6f2abe04274f-cert\") pod \"kserve-controller-manager-85dd7cfb4d-rn92v\" (UID: \"33b90b00-ed45-4e02-bc2c-6f2abe04274f\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:26.279034 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.278891 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cvf7r\" (UniqueName: \"kubernetes.io/projected/33b90b00-ed45-4e02-bc2c-6f2abe04274f-kube-api-access-cvf7r\") pod \"kserve-controller-manager-85dd7cfb4d-rn92v\" (UID: \"33b90b00-ed45-4e02-bc2c-6f2abe04274f\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:26.281473 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.281445 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/33b90b00-ed45-4e02-bc2c-6f2abe04274f-cert\") pod \"kserve-controller-manager-85dd7cfb4d-rn92v\" (UID: \"33b90b00-ed45-4e02-bc2c-6f2abe04274f\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:26.291261 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.291216 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cvf7r\" (UniqueName: \"kubernetes.io/projected/33b90b00-ed45-4e02-bc2c-6f2abe04274f-kube-api-access-cvf7r\") pod \"kserve-controller-manager-85dd7cfb4d-rn92v\" (UID: \"33b90b00-ed45-4e02-bc2c-6f2abe04274f\") " pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:26.379222 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.379188 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/697207a3-3f07-49b4-ac35-aa189e7e1614-cert\") pod \"697207a3-3f07-49b4-ac35-aa189e7e1614\" (UID: \"697207a3-3f07-49b4-ac35-aa189e7e1614\") " Apr 17 17:34:26.379222 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.379227 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lzhrq\" (UniqueName: \"kubernetes.io/projected/697207a3-3f07-49b4-ac35-aa189e7e1614-kube-api-access-lzhrq\") pod \"697207a3-3f07-49b4-ac35-aa189e7e1614\" (UID: \"697207a3-3f07-49b4-ac35-aa189e7e1614\") " Apr 17 17:34:26.381417 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.381387 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/697207a3-3f07-49b4-ac35-aa189e7e1614-cert" (OuterVolumeSpecName: "cert") pod "697207a3-3f07-49b4-ac35-aa189e7e1614" (UID: "697207a3-3f07-49b4-ac35-aa189e7e1614"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:34:26.381525 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.381477 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/697207a3-3f07-49b4-ac35-aa189e7e1614-kube-api-access-lzhrq" (OuterVolumeSpecName: "kube-api-access-lzhrq") pod "697207a3-3f07-49b4-ac35-aa189e7e1614" (UID: "697207a3-3f07-49b4-ac35-aa189e7e1614"). InnerVolumeSpecName "kube-api-access-lzhrq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:34:26.440587 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.440547 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:26.480639 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.480597 2565 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/697207a3-3f07-49b4-ac35-aa189e7e1614-cert\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:34:26.480639 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.480630 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lzhrq\" (UniqueName: \"kubernetes.io/projected/697207a3-3f07-49b4-ac35-aa189e7e1614-kube-api-access-lzhrq\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:34:26.560641 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.560607 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-rn92v"] Apr 17 17:34:26.563747 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:34:26.563716 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33b90b00_ed45_4e02_bc2c_6f2abe04274f.slice/crio-991e9f1d0ac6c764b0a66cad768f9a90d564a75a453fe10129a50637324db983 WatchSource:0}: Error finding container 991e9f1d0ac6c764b0a66cad768f9a90d564a75a453fe10129a50637324db983: Status 404 returned error can't find the container with id 991e9f1d0ac6c764b0a66cad768f9a90d564a75a453fe10129a50637324db983 Apr 17 17:34:26.565059 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.565042 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:34:26.867193 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.867106 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" event={"ID":"33b90b00-ed45-4e02-bc2c-6f2abe04274f","Type":"ContainerStarted","Data":"991e9f1d0ac6c764b0a66cad768f9a90d564a75a453fe10129a50637324db983"} Apr 17 17:34:26.868215 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.868190 2565 generic.go:358] "Generic (PLEG): container finished" podID="697207a3-3f07-49b4-ac35-aa189e7e1614" containerID="c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287" exitCode=0 Apr 17 17:34:26.868354 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.868276 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" event={"ID":"697207a3-3f07-49b4-ac35-aa189e7e1614","Type":"ContainerDied","Data":"c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287"} Apr 17 17:34:26.868354 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.868312 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" Apr 17 17:34:26.868354 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.868327 2565 scope.go:117] "RemoveContainer" containerID="c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287" Apr 17 17:34:26.868495 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.868315 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-mc6bv" event={"ID":"697207a3-3f07-49b4-ac35-aa189e7e1614","Type":"ContainerDied","Data":"4166505321b3b4d74543206f9211a1d5b0ff60ee8a418d772fafe32bec972930"} Apr 17 17:34:26.876767 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.876745 2565 scope.go:117] "RemoveContainer" containerID="c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287" Apr 17 17:34:26.877096 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:34:26.877066 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287\": container with ID starting with c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287 not found: ID does not exist" 
containerID="c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287" Apr 17 17:34:26.877199 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.877103 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287"} err="failed to get container status \"c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287\": rpc error: code = NotFound desc = could not find container \"c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287\": container with ID starting with c11ed88beb4a712290216c709100aeafc48a48e384cd91121d8ceba43894a287 not found: ID does not exist" Apr 17 17:34:26.889933 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.889912 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mc6bv"] Apr 17 17:34:26.895311 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:26.895291 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-85dd7cfb4d-mc6bv"] Apr 17 17:34:27.873257 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:27.873215 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" event={"ID":"33b90b00-ed45-4e02-bc2c-6f2abe04274f","Type":"ContainerStarted","Data":"c85d743e4655b8d81f20d1375d7287bba7e5f274ee2dc38c5f8eb374b583bc05"} Apr 17 17:34:27.873644 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:27.873396 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:27.890767 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:27.890718 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" podStartSLOduration=1.398751668 podStartE2EDuration="1.890704466s" podCreationTimestamp="2026-04-17 17:34:26 +0000 UTC" firstStartedPulling="2026-04-17 
17:34:26.565161132 +0000 UTC m=+607.223205418" lastFinishedPulling="2026-04-17 17:34:27.057113927 +0000 UTC m=+607.715158216" observedRunningTime="2026-04-17 17:34:27.888716982 +0000 UTC m=+608.546761290" watchObservedRunningTime="2026-04-17 17:34:27.890704466 +0000 UTC m=+608.548748802" Apr 17 17:34:27.952055 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:27.952024 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="697207a3-3f07-49b4-ac35-aa189e7e1614" path="/var/lib/kubelet/pods/697207a3-3f07-49b4-ac35-aa189e7e1614/volumes" Apr 17 17:34:58.882304 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:58.882275 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-85dd7cfb4d-rn92v" Apr 17 17:34:59.760035 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.759997 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-tzhl6"] Apr 17 17:34:59.760510 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.760485 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="697207a3-3f07-49b4-ac35-aa189e7e1614" containerName="manager" Apr 17 17:34:59.760510 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.760510 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="697207a3-3f07-49b4-ac35-aa189e7e1614" containerName="manager" Apr 17 17:34:59.760709 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.760598 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="697207a3-3f07-49b4-ac35-aa189e7e1614" containerName="manager" Apr 17 17:34:59.763788 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.763769 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:34:59.768066 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.768039 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-9f8z9\"" Apr 17 17:34:59.768435 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.768286 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 17 17:34:59.778178 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.778155 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tzhl6"] Apr 17 17:34:59.780712 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.780692 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-jnwcl"] Apr 17 17:34:59.783973 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.783953 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:34:59.786124 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.785981 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-bhdbn\"" Apr 17 17:34:59.786124 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.786013 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 17 17:34:59.796147 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.796123 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-jnwcl"] Apr 17 17:34:59.869516 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.869477 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n8w4q\" (UniqueName: \"kubernetes.io/projected/8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2-kube-api-access-n8w4q\") pod 
\"model-serving-api-86f7b4b499-tzhl6\" (UID: \"8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2\") " pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:34:59.869719 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.869524 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2-tls-certs\") pod \"model-serving-api-86f7b4b499-tzhl6\" (UID: \"8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2\") " pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:34:59.869719 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.869625 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wsbcj\" (UniqueName: \"kubernetes.io/projected/8f4284f1-4844-4c8b-872b-f59118b23f2a-kube-api-access-wsbcj\") pod \"odh-model-controller-696fc77849-jnwcl\" (UID: \"8f4284f1-4844-4c8b-872b-f59118b23f2a\") " pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:34:59.869719 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.869676 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f4284f1-4844-4c8b-872b-f59118b23f2a-cert\") pod \"odh-model-controller-696fc77849-jnwcl\" (UID: \"8f4284f1-4844-4c8b-872b-f59118b23f2a\") " pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:34:59.970897 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.970863 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wsbcj\" (UniqueName: \"kubernetes.io/projected/8f4284f1-4844-4c8b-872b-f59118b23f2a-kube-api-access-wsbcj\") pod \"odh-model-controller-696fc77849-jnwcl\" (UID: \"8f4284f1-4844-4c8b-872b-f59118b23f2a\") " pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:34:59.971375 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.970918 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f4284f1-4844-4c8b-872b-f59118b23f2a-cert\") pod \"odh-model-controller-696fc77849-jnwcl\" (UID: \"8f4284f1-4844-4c8b-872b-f59118b23f2a\") " pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:34:59.971375 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.970990 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n8w4q\" (UniqueName: \"kubernetes.io/projected/8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2-kube-api-access-n8w4q\") pod \"model-serving-api-86f7b4b499-tzhl6\" (UID: \"8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2\") " pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:34:59.971375 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.971020 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2-tls-certs\") pod \"model-serving-api-86f7b4b499-tzhl6\" (UID: \"8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2\") " pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:34:59.971375 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:34:59.971110 2565 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 17 17:34:59.971375 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:34:59.971192 2565 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8f4284f1-4844-4c8b-872b-f59118b23f2a-cert podName:8f4284f1-4844-4c8b-872b-f59118b23f2a nodeName:}" failed. No retries permitted until 2026-04-17 17:35:00.471168264 +0000 UTC m=+641.129212567 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/8f4284f1-4844-4c8b-872b-f59118b23f2a-cert") pod "odh-model-controller-696fc77849-jnwcl" (UID: "8f4284f1-4844-4c8b-872b-f59118b23f2a") : secret "odh-model-controller-webhook-cert" not found Apr 17 17:34:59.973489 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.973465 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2-tls-certs\") pod \"model-serving-api-86f7b4b499-tzhl6\" (UID: \"8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2\") " pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:34:59.982362 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.982339 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n8w4q\" (UniqueName: \"kubernetes.io/projected/8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2-kube-api-access-n8w4q\") pod \"model-serving-api-86f7b4b499-tzhl6\" (UID: \"8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2\") " pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:34:59.982477 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:34:59.982375 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wsbcj\" (UniqueName: \"kubernetes.io/projected/8f4284f1-4844-4c8b-872b-f59118b23f2a-kube-api-access-wsbcj\") pod \"odh-model-controller-696fc77849-jnwcl\" (UID: \"8f4284f1-4844-4c8b-872b-f59118b23f2a\") " pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:35:00.078081 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:00.078002 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:35:00.200054 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:00.200028 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-tzhl6"] Apr 17 17:35:00.202363 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:35:00.202334 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e84e5ce_7e2f_40d3_9055_3d12d32d8ff2.slice/crio-fe4d2dd982f0f84c53da85d899f9492d1b971155ecc4e4209e50a7b36403758b WatchSource:0}: Error finding container fe4d2dd982f0f84c53da85d899f9492d1b971155ecc4e4209e50a7b36403758b: Status 404 returned error can't find the container with id fe4d2dd982f0f84c53da85d899f9492d1b971155ecc4e4209e50a7b36403758b Apr 17 17:35:00.475472 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:00.475435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f4284f1-4844-4c8b-872b-f59118b23f2a-cert\") pod \"odh-model-controller-696fc77849-jnwcl\" (UID: \"8f4284f1-4844-4c8b-872b-f59118b23f2a\") " pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:35:00.477824 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:00.477797 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/8f4284f1-4844-4c8b-872b-f59118b23f2a-cert\") pod \"odh-model-controller-696fc77849-jnwcl\" (UID: \"8f4284f1-4844-4c8b-872b-f59118b23f2a\") " pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:35:00.697736 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:00.697696 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:35:00.852758 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:00.852662 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-jnwcl"] Apr 17 17:35:00.905894 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:35:00.905856 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8f4284f1_4844_4c8b_872b_f59118b23f2a.slice/crio-4c826f8b8ef11a789bba94602f040cdb7e85a3cf37bdb64a87eb8bb4e63b66e0 WatchSource:0}: Error finding container 4c826f8b8ef11a789bba94602f040cdb7e85a3cf37bdb64a87eb8bb4e63b66e0: Status 404 returned error can't find the container with id 4c826f8b8ef11a789bba94602f040cdb7e85a3cf37bdb64a87eb8bb4e63b66e0 Apr 17 17:35:00.982528 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:00.982470 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-jnwcl" event={"ID":"8f4284f1-4844-4c8b-872b-f59118b23f2a","Type":"ContainerStarted","Data":"4c826f8b8ef11a789bba94602f040cdb7e85a3cf37bdb64a87eb8bb4e63b66e0"} Apr 17 17:35:00.983683 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:00.983655 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tzhl6" event={"ID":"8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2","Type":"ContainerStarted","Data":"fe4d2dd982f0f84c53da85d899f9492d1b971155ecc4e4209e50a7b36403758b"} Apr 17 17:35:01.989991 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:01.989916 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-tzhl6" event={"ID":"8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2","Type":"ContainerStarted","Data":"0dae56ee4b365eefded1206b87f30c367739ca099c74369abe93b86731d46fdb"} Apr 17 17:35:01.990450 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:01.990133 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:35:02.007955 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:02.007686 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-tzhl6" podStartSLOduration=1.713883544 podStartE2EDuration="3.00766938s" podCreationTimestamp="2026-04-17 17:34:59 +0000 UTC" firstStartedPulling="2026-04-17 17:35:00.204134643 +0000 UTC m=+640.862178932" lastFinishedPulling="2026-04-17 17:35:01.497920477 +0000 UTC m=+642.155964768" observedRunningTime="2026-04-17 17:35:02.006784057 +0000 UTC m=+642.664828366" watchObservedRunningTime="2026-04-17 17:35:02.00766938 +0000 UTC m=+642.665713695" Apr 17 17:35:03.998764 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:03.998719 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-jnwcl" event={"ID":"8f4284f1-4844-4c8b-872b-f59118b23f2a","Type":"ContainerStarted","Data":"1318309a198416b4d01caa3658988b6e7e0dedf4bef78ab2a4f5c22f0cf17fdb"} Apr 17 17:35:03.999238 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:03.998822 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:35:04.017772 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:04.017719 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-jnwcl" podStartSLOduration=2.331477357 podStartE2EDuration="5.017705391s" podCreationTimestamp="2026-04-17 17:34:59 +0000 UTC" firstStartedPulling="2026-04-17 17:35:00.90741224 +0000 UTC m=+641.565456526" lastFinishedPulling="2026-04-17 17:35:03.593640274 +0000 UTC m=+644.251684560" observedRunningTime="2026-04-17 17:35:04.015193602 +0000 UTC m=+644.673237922" watchObservedRunningTime="2026-04-17 17:35:04.017705391 +0000 UTC m=+644.675749698" Apr 17 17:35:12.998137 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:12.998106 2565 kubelet.go:2658] "SyncLoop 
(probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-tzhl6" Apr 17 17:35:15.004886 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:15.004858 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-jnwcl" Apr 17 17:35:37.697175 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.697140 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz"] Apr 17 17:35:37.704222 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.704198 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.706718 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.706670 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"istio-ca-root-cert\"" Apr 17 17:35:37.707435 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.707149 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-gateway-1-openshift-default-dockercfg-c6rr7\"" Apr 17 17:35:37.707435 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.707258 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 17 17:35:37.707435 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.707364 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 17 17:35:37.713401 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.713073 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz"] Apr 17 17:35:37.802207 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.802172 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.802410 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.802220 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.802410 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.802317 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.802410 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.802346 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.802410 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.802378 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-envoy\" (UniqueName: 
\"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.802607 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.802414 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.803088 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.803058 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qkllt\" (UniqueName: \"kubernetes.io/projected/ad21ccfc-94b0-40d2-8013-58c9bfe91201-kube-api-access-qkllt\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.803388 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.803365 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.803510 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.803464 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-podinfo\" (UniqueName: 
\"kubernetes.io/downward-api/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.904812 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.904770 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905002 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.904829 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905002 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.904859 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905002 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.904881 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istiod-ca-cert\" (UniqueName: \"kubernetes.io/configmap/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: 
\"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905002 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.904910 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905002 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.904966 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905280 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.905016 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905280 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.905054 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905280 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.905147 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qkllt\" (UniqueName: \"kubernetes.io/projected/ad21ccfc-94b0-40d2-8013-58c9bfe91201-kube-api-access-qkllt\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905280 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.905196 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-data\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-data\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905280 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.905271 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-certs\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-workload-certs\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905621 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.905598 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"workload-socket\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-workload-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905712 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.905691 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istiod-ca-cert\" (UniqueName: 
\"kubernetes.io/configmap/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istiod-ca-cert\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.905772 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.905716 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"credential-socket\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-credential-socket\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.907716 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.907677 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-podinfo\" (UniqueName: \"kubernetes.io/downward-api/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-podinfo\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.907871 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.907761 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-envoy\" (UniqueName: \"kubernetes.io/empty-dir/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-envoy\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.913526 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.913496 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/ad21ccfc-94b0-40d2-8013-58c9bfe91201-istio-token\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") 
" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:37.914058 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:37.914034 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qkllt\" (UniqueName: \"kubernetes.io/projected/ad21ccfc-94b0-40d2-8013-58c9bfe91201-kube-api-access-qkllt\") pod \"router-gateway-1-openshift-default-6c59fbf55c-wkkpz\" (UID: \"ad21ccfc-94b0-40d2-8013-58c9bfe91201\") " pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:38.017891 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:38.017806 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:38.162758 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:38.162733 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz"] Apr 17 17:35:38.166226 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:35:38.166186 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad21ccfc_94b0_40d2_8013_58c9bfe91201.slice/crio-fa693fbc8a806f92e6337cdb7de6274a2ff7463dcc62f5b7eb07dfbca9f97ce6 WatchSource:0}: Error finding container fa693fbc8a806f92e6337cdb7de6274a2ff7463dcc62f5b7eb07dfbca9f97ce6: Status 404 returned error can't find the container with id fa693fbc8a806f92e6337cdb7de6274a2ff7463dcc62f5b7eb07dfbca9f97ce6 Apr 17 17:35:38.169023 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:38.168867 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:35:38.169023 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:38.168956 2565 kubelet_resources.go:45] "Allocatable" 
allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:35:38.169023 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:38.169001 2565 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236228Ki","pods":"250"} Apr 17 17:35:39.113704 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:39.113670 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" event={"ID":"ad21ccfc-94b0-40d2-8013-58c9bfe91201","Type":"ContainerStarted","Data":"594dd9c4517a2866684ea9cfcbd6e9d1ba7f44736bba9f148b4131391bd43292"} Apr 17 17:35:39.113704 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:39.113706 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" event={"ID":"ad21ccfc-94b0-40d2-8013-58c9bfe91201","Type":"ContainerStarted","Data":"fa693fbc8a806f92e6337cdb7de6274a2ff7463dcc62f5b7eb07dfbca9f97ce6"} Apr 17 17:35:39.137034 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:39.136978 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" podStartSLOduration=2.136963418 podStartE2EDuration="2.136963418s" podCreationTimestamp="2026-04-17 17:35:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:35:39.134660801 +0000 UTC m=+679.792705109" watchObservedRunningTime="2026-04-17 17:35:39.136963418 +0000 UTC m=+679.795007726" Apr 17 17:35:40.017935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:40.017898 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 
17:35:40.023018 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:40.022996 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:40.116963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:40.116932 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:35:40.118022 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:35:40.118000 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-gateway-1-openshift-default-6c59fbf55c-wkkpz" Apr 17 17:36:05.356689 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.356647 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95"] Apr 17 17:36:05.362737 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.362713 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.365196 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.365174 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-xgf5k\"" Apr 17 17:36:05.365764 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.365746 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:36:05.365866 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.365748 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 17 17:36:05.373094 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.373063 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95"] Apr 17 17:36:05.431916 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.431883 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-m75sq\" (UniqueName: \"kubernetes.io/projected/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kube-api-access-m75sq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.431916 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.431923 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.432109 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.431952 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.432109 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.432019 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.432109 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.432077 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.432208 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.432111 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: 
\"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.532944 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.532889 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.532944 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.532946 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.533169 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.532988 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.533169 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.533016 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.533169 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.533071 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-m75sq\" (UniqueName: \"kubernetes.io/projected/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kube-api-access-m75sq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.533169 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.533104 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.533467 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.533378 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.533467 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.533398 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.533467 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.533437 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.533593 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.533478 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.535618 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.535594 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.541432 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.541385 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-m75sq\" (UniqueName: \"kubernetes.io/projected/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kube-api-access-m75sq\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.675700 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.675657 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:05.807692 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:05.807666 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95"] Apr 17 17:36:05.810464 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:36:05.810435 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddc2ba77e_d8b5_455c_ae42_e315be07f9db.slice/crio-e2f56bab3315af1ba9566e788e3d1824fd9c16e368bc3027dc9d63f41d978ced WatchSource:0}: Error finding container e2f56bab3315af1ba9566e788e3d1824fd9c16e368bc3027dc9d63f41d978ced: Status 404 returned error can't find the container with id e2f56bab3315af1ba9566e788e3d1824fd9c16e368bc3027dc9d63f41d978ced Apr 17 17:36:06.207168 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:06.207126 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" event={"ID":"dc2ba77e-d8b5-455c-ae42-e315be07f9db","Type":"ContainerStarted","Data":"e2f56bab3315af1ba9566e788e3d1824fd9c16e368bc3027dc9d63f41d978ced"} Apr 17 17:36:09.223759 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:09.223724 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" event={"ID":"dc2ba77e-d8b5-455c-ae42-e315be07f9db","Type":"ContainerStarted","Data":"17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422"} Apr 17 17:36:10.228896 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:10.228805 2565 generic.go:358] "Generic (PLEG): 
container finished" podID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerID="17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422" exitCode=0 Apr 17 17:36:10.228896 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:10.228846 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" event={"ID":"dc2ba77e-d8b5-455c-ae42-e315be07f9db","Type":"ContainerDied","Data":"17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422"} Apr 17 17:36:12.238745 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:12.238705 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" event={"ID":"dc2ba77e-d8b5-455c-ae42-e315be07f9db","Type":"ContainerStarted","Data":"5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde"} Apr 17 17:36:40.352405 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:40.352367 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" event={"ID":"dc2ba77e-d8b5-455c-ae42-e315be07f9db","Type":"ContainerStarted","Data":"144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb"} Apr 17 17:36:40.352919 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:40.352507 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:41.358963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:41.358932 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:41.384772 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:41.382213 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" podStartSLOduration=2.028783632 podStartE2EDuration="36.382194515s" podCreationTimestamp="2026-04-17 17:36:05 +0000 UTC" firstStartedPulling="2026-04-17 17:36:05.812470599 +0000 UTC m=+706.470514885" lastFinishedPulling="2026-04-17 17:36:40.165881469 +0000 UTC m=+740.823925768" observedRunningTime="2026-04-17 17:36:40.383902144 +0000 UTC m=+741.041946488" watchObservedRunningTime="2026-04-17 17:36:41.382194515 +0000 UTC m=+742.040238825" Apr 17 17:36:45.676462 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:45.676417 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:45.676885 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:45.676505 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:45.676885 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:45.676728 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.35:8082/healthz\": dial tcp 10.134.0.35:8082: connect: connection refused" Apr 17 17:36:55.677544 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:55.677508 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:55.678720 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:55.678698 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:57.249220 ip-10-0-133-87 kubenswrapper[2565]: 
I0417 17:36:57.249182 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95"] Apr 17 17:36:57.417416 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:57.417353 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="main" containerID="cri-o://5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde" gracePeriod=30 Apr 17 17:36:57.417604 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:57.417392 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="tokenizer" containerID="cri-o://144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb" gracePeriod=30 Apr 17 17:36:58.422640 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.422609 2565 generic.go:358] "Generic (PLEG): container finished" podID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerID="5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde" exitCode=0 Apr 17 17:36:58.423003 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.422684 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" event={"ID":"dc2ba77e-d8b5-455c-ae42-e315be07f9db","Type":"ContainerDied","Data":"5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde"} Apr 17 17:36:58.658652 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.658629 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:58.748188 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748104 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tls-certs\") pod \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " Apr 17 17:36:58.748188 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748147 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-tmp\") pod \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " Apr 17 17:36:58.748420 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748227 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-m75sq\" (UniqueName: \"kubernetes.io/projected/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kube-api-access-m75sq\") pod \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " Apr 17 17:36:58.748420 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748286 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-uds\") pod \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " Apr 17 17:36:58.748420 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748308 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kserve-provision-location\") pod \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " Apr 17 
17:36:58.748420 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748330 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-cache\") pod \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\" (UID: \"dc2ba77e-d8b5-455c-ae42-e315be07f9db\") " Apr 17 17:36:58.748627 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748575 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "dc2ba77e-d8b5-455c-ae42-e315be07f9db" (UID: "dc2ba77e-d8b5-455c-ae42-e315be07f9db"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:58.748627 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748595 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "dc2ba77e-d8b5-455c-ae42-e315be07f9db" (UID: "dc2ba77e-d8b5-455c-ae42-e315be07f9db"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:58.748784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.748752 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "dc2ba77e-d8b5-455c-ae42-e315be07f9db" (UID: "dc2ba77e-d8b5-455c-ae42-e315be07f9db"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:58.749075 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.749056 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dc2ba77e-d8b5-455c-ae42-e315be07f9db" (UID: "dc2ba77e-d8b5-455c-ae42-e315be07f9db"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:36:58.750463 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.750444 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dc2ba77e-d8b5-455c-ae42-e315be07f9db" (UID: "dc2ba77e-d8b5-455c-ae42-e315be07f9db"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:36:58.750533 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.750477 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kube-api-access-m75sq" (OuterVolumeSpecName: "kube-api-access-m75sq") pod "dc2ba77e-d8b5-455c-ae42-e315be07f9db" (UID: "dc2ba77e-d8b5-455c-ae42-e315be07f9db"). InnerVolumeSpecName "kube-api-access-m75sq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:36:58.849399 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.849358 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:36:58.849399 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.849392 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:36:58.849399 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.849402 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-m75sq\" (UniqueName: \"kubernetes.io/projected/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kube-api-access-m75sq\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:36:58.849628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.849414 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:36:58.849628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.849424 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:36:58.849628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:58.849433 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/dc2ba77e-d8b5-455c-ae42-e315be07f9db-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:36:59.428126 ip-10-0-133-87 kubenswrapper[2565]: 
I0417 17:36:59.428092 2565 generic.go:358] "Generic (PLEG): container finished" podID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerID="144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb" exitCode=0 Apr 17 17:36:59.428597 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.428141 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" event={"ID":"dc2ba77e-d8b5-455c-ae42-e315be07f9db","Type":"ContainerDied","Data":"144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb"} Apr 17 17:36:59.428597 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.428164 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" event={"ID":"dc2ba77e-d8b5-455c-ae42-e315be07f9db","Type":"ContainerDied","Data":"e2f56bab3315af1ba9566e788e3d1824fd9c16e368bc3027dc9d63f41d978ced"} Apr 17 17:36:59.428597 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.428178 2565 scope.go:117] "RemoveContainer" containerID="144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb" Apr 17 17:36:59.428597 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.428181 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95" Apr 17 17:36:59.437015 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.436997 2565 scope.go:117] "RemoveContainer" containerID="5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde" Apr 17 17:36:59.444056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.444039 2565 scope.go:117] "RemoveContainer" containerID="17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422" Apr 17 17:36:59.450528 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.450502 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95"] Apr 17 17:36:59.451657 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.451633 2565 scope.go:117] "RemoveContainer" containerID="144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb" Apr 17 17:36:59.451913 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:36:59.451894 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb\": container with ID starting with 144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb not found: ID does not exist" containerID="144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb" Apr 17 17:36:59.451970 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.451923 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb"} err="failed to get container status \"144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb\": rpc error: code = NotFound desc = could not find container \"144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb\": container with ID starting with 144e7d2e6c0254bb27c17565384cfe207c14dd6fcf66a267e780cf360c7743bb not 
found: ID does not exist" Apr 17 17:36:59.451970 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.451941 2565 scope.go:117] "RemoveContainer" containerID="5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde" Apr 17 17:36:59.452237 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:36:59.452186 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde\": container with ID starting with 5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde not found: ID does not exist" containerID="5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde" Apr 17 17:36:59.452237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.452221 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde"} err="failed to get container status \"5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde\": rpc error: code = NotFound desc = could not find container \"5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde\": container with ID starting with 5e17e7c14fbdb02b99deb873be7cc52af34777d9916dcf22f1222c8c5b709dde not found: ID does not exist" Apr 17 17:36:59.452237 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.452259 2565 scope.go:117] "RemoveContainer" containerID="17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422" Apr 17 17:36:59.452662 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:36:59.452635 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422\": container with ID starting with 17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422 not found: ID does not exist" containerID="17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422" 
Apr 17 17:36:59.452743 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.452666 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422"} err="failed to get container status \"17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422\": rpc error: code = NotFound desc = could not find container \"17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422\": container with ID starting with 17e4dba96864c0a223f03dabe2f62e75c166111b1f8dbbeced87319cd7446422 not found: ID does not exist" Apr 17 17:36:59.454602 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.454581 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-5fd8459w95"] Apr 17 17:36:59.953024 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:36:59.952983 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" path="/var/lib/kubelet/pods/dc2ba77e-d8b5-455c-ae42-e315be07f9db/volumes" Apr 17 17:37:07.127470 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.127430 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59"] Apr 17 17:37:07.128056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.127983 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="main" Apr 17 17:37:07.128056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.128001 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="main" Apr 17 17:37:07.128056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.128016 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="storage-initializer" Apr 
17 17:37:07.128056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.128025 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="storage-initializer" Apr 17 17:37:07.128056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.128037 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="tokenizer" Apr 17 17:37:07.128056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.128046 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="tokenizer" Apr 17 17:37:07.128344 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.128125 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="main" Apr 17 17:37:07.128344 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.128141 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="dc2ba77e-d8b5-455c-ae42-e315be07f9db" containerName="tokenizer" Apr 17 17:37:07.411860 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.411813 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59"] Apr 17 17:37:07.412059 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.412030 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.415819 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.415795 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 17 17:37:07.415964 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.415822 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:37:07.415964 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.415800 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-z57f5\"" Apr 17 17:37:07.420899 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.420877 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.420995 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.420919 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.421059 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.421025 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-9275d\" (UniqueName: \"kubernetes.io/projected/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kube-api-access-9275d\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.421110 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.421083 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.421110 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.421103 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.421187 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.421122 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.521720 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.521677 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.521918 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.521734 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.521918 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.521774 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9275d\" (UniqueName: \"kubernetes.io/projected/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kube-api-access-9275d\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.521918 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.521806 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.521918 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.521831 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.521918 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.521860 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.522238 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.522213 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.522349 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.522269 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.522349 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.522294 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.522457 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.522395 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.524700 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.524673 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.530685 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.530662 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9275d\" (UniqueName: \"kubernetes.io/projected/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kube-api-access-9275d\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:07.721880 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:07.721799 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:08.056719 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:08.056641 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59"] Apr 17 17:37:08.060340 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:37:08.060300 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2ce3e895_69ab_4fa8_a8f7_10a0284b04be.slice/crio-24caf0a3f04519884c877f5a4ad047677b3dabd71cc00ffb283913e2d1c63657 WatchSource:0}: Error finding container 24caf0a3f04519884c877f5a4ad047677b3dabd71cc00ffb283913e2d1c63657: Status 404 returned error can't find the container with id 24caf0a3f04519884c877f5a4ad047677b3dabd71cc00ffb283913e2d1c63657 Apr 17 17:37:08.465744 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:08.465702 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" event={"ID":"2ce3e895-69ab-4fa8-a8f7-10a0284b04be","Type":"ContainerStarted","Data":"122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7"} Apr 17 17:37:08.465744 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:08.465744 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" event={"ID":"2ce3e895-69ab-4fa8-a8f7-10a0284b04be","Type":"ContainerStarted","Data":"24caf0a3f04519884c877f5a4ad047677b3dabd71cc00ffb283913e2d1c63657"} Apr 17 17:37:09.470152 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:09.470117 2565 generic.go:358] "Generic (PLEG): container finished" podID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerID="122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7" exitCode=0 Apr 17 17:37:09.470658 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:37:09.470182 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" event={"ID":"2ce3e895-69ab-4fa8-a8f7-10a0284b04be","Type":"ContainerDied","Data":"122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7"} Apr 17 17:37:10.475901 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:10.475864 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" event={"ID":"2ce3e895-69ab-4fa8-a8f7-10a0284b04be","Type":"ContainerStarted","Data":"0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d"} Apr 17 17:37:10.475901 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:10.475907 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" event={"ID":"2ce3e895-69ab-4fa8-a8f7-10a0284b04be","Type":"ContainerStarted","Data":"df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad"} Apr 17 17:37:10.476346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:10.475986 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:10.498149 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:10.498088 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" podStartSLOduration=3.498066749 podStartE2EDuration="3.498066749s" podCreationTimestamp="2026-04-17 17:37:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:37:10.495934596 +0000 UTC m=+771.153978941" watchObservedRunningTime="2026-04-17 17:37:10.498066749 +0000 UTC m=+771.156111058" Apr 17 17:37:17.722389 ip-10-0-133-87 kubenswrapper[2565]: 
I0417 17:37:17.722350 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:17.722993 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:17.722404 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:17.724952 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:17.724928 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:18.503447 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:18.503416 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:37:25.934464 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:25.934428 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"] Apr 17 17:37:25.947080 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:25.947052 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:25.949800 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:25.949547 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-j68mk\"" Apr 17 17:37:25.950623 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:25.950600 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 17 17:37:25.954507 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:25.954484 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"] Apr 17 17:37:26.098086 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.098050 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.098302 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.098195 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.098302 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.098229 2565 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwr9f\" (UniqueName: \"kubernetes.io/projected/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kube-api-access-dwr9f\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.098302 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.098280 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.098433 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.098341 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.098494 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.098469 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199067 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.198980 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199067 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199030 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199067 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199057 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwr9f\" (UniqueName: \"kubernetes.io/projected/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kube-api-access-dwr9f\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199376 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199077 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199376 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199108 2565 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199376 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199175 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199528 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199401 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199528 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199475 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199528 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199494 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.199628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.199552 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.201532 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.201507 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.207806 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.207783 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwr9f\" (UniqueName: \"kubernetes.io/projected/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kube-api-access-dwr9f\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.259268 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.259215 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" Apr 17 17:37:26.392810 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.392784 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"] Apr 17 17:37:26.395042 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:37:26.395009 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeed870e7_a8e3_452d_9200_9c7eed4f6f71.slice/crio-ef04713b0b876caf8d46a5e141eaf99faeb87ec65b19816b28e4cfd1a2f61c25 WatchSource:0}: Error finding container ef04713b0b876caf8d46a5e141eaf99faeb87ec65b19816b28e4cfd1a2f61c25: Status 404 returned error can't find the container with id ef04713b0b876caf8d46a5e141eaf99faeb87ec65b19816b28e4cfd1a2f61c25 Apr 17 17:37:26.531477 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.531443 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" event={"ID":"eed870e7-a8e3-452d-9200-9c7eed4f6f71","Type":"ContainerStarted","Data":"78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842"} Apr 17 17:37:26.531477 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:26.531482 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" event={"ID":"eed870e7-a8e3-452d-9200-9c7eed4f6f71","Type":"ContainerStarted","Data":"ef04713b0b876caf8d46a5e141eaf99faeb87ec65b19816b28e4cfd1a2f61c25"} Apr 17 17:37:27.536349 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:27.536230 2565 generic.go:358] "Generic (PLEG): container finished" podID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerID="78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842" exitCode=0 Apr 17 17:37:27.536349 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:37:27.536316 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" event={"ID":"eed870e7-a8e3-452d-9200-9c7eed4f6f71","Type":"ContainerDied","Data":"78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842"}
Apr 17 17:37:28.542943 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:28.542901 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" event={"ID":"eed870e7-a8e3-452d-9200-9c7eed4f6f71","Type":"ContainerStarted","Data":"96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0"}
Apr 17 17:37:28.542943 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:28.542948 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" event={"ID":"eed870e7-a8e3-452d-9200-9c7eed4f6f71","Type":"ContainerStarted","Data":"b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37"}
Apr 17 17:37:28.543457 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:28.543081 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"
Apr 17 17:37:28.567827 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:28.567761 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" podStartSLOduration=3.567739809 podStartE2EDuration="3.567739809s" podCreationTimestamp="2026-04-17 17:37:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:37:28.564217533 +0000 UTC m=+789.222261841" watchObservedRunningTime="2026-04-17 17:37:28.567739809 +0000 UTC m=+789.225784118"
Apr 17 17:37:36.259635 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:36.259590 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"
Apr 17 17:37:36.260197 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:36.259649 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"
Apr 17 17:37:36.262146 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:36.262127 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"
Apr 17 17:37:36.571065 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:36.570983 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"
Apr 17 17:37:39.507307 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:39.507223 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59"
Apr 17 17:37:57.575202 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:57.575173 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"
Apr 17 17:37:58.692498 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:58.692464 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"]
Apr 17 17:37:58.692918 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:58.692846 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="main" containerID="cri-o://b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37" gracePeriod=30
Apr 17 17:37:58.692991 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:58.692902 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="tokenizer" containerID="cri-o://96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0" gracePeriod=30
Apr 17 17:37:59.661042 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:59.661009 2565 generic.go:358] "Generic (PLEG): container finished" podID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerID="b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37" exitCode=0
Apr 17 17:37:59.661223 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:37:59.661091 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" event={"ID":"eed870e7-a8e3-452d-9200-9c7eed4f6f71","Type":"ContainerDied","Data":"b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37"}
Apr 17 17:38:00.046104 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.046079 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"
Apr 17 17:38:00.099830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.099804 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tls-certs\") pod \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") "
Apr 17 17:38:00.099988 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.099880 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-tmp\") pod \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") "
Apr 17 17:38:00.099988 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.099964 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-uds\") pod \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") "
Apr 17 17:38:00.100060 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.099994 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-cache\") pod \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") "
Apr 17 17:38:00.100093 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.100057 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dwr9f\" (UniqueName: \"kubernetes.io/projected/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kube-api-access-dwr9f\") pod \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") "
Apr 17 17:38:00.100093 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.100081 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kserve-provision-location\") pod \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\" (UID: \"eed870e7-a8e3-452d-9200-9c7eed4f6f71\") "
Apr 17 17:38:00.100216 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.100185 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "eed870e7-a8e3-452d-9200-9c7eed4f6f71" (UID: "eed870e7-a8e3-452d-9200-9c7eed4f6f71"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:38:00.100216 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.100203 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "eed870e7-a8e3-452d-9200-9c7eed4f6f71" (UID: "eed870e7-a8e3-452d-9200-9c7eed4f6f71"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:38:00.100356 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.100320 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "eed870e7-a8e3-452d-9200-9c7eed4f6f71" (UID: "eed870e7-a8e3-452d-9200-9c7eed4f6f71"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:38:00.100416 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.100390 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:38:00.100416 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.100404 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:38:00.100948 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.100921 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "eed870e7-a8e3-452d-9200-9c7eed4f6f71" (UID: "eed870e7-a8e3-452d-9200-9c7eed4f6f71"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:38:00.102074 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.102049 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "eed870e7-a8e3-452d-9200-9c7eed4f6f71" (UID: "eed870e7-a8e3-452d-9200-9c7eed4f6f71"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:38:00.102157 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.102099 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kube-api-access-dwr9f" (OuterVolumeSpecName: "kube-api-access-dwr9f") pod "eed870e7-a8e3-452d-9200-9c7eed4f6f71" (UID: "eed870e7-a8e3-452d-9200-9c7eed4f6f71"). InnerVolumeSpecName "kube-api-access-dwr9f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:38:00.201651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.201614 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:38:00.201651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.201645 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dwr9f\" (UniqueName: \"kubernetes.io/projected/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kube-api-access-dwr9f\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:38:00.201651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.201656 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/eed870e7-a8e3-452d-9200-9c7eed4f6f71-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:38:00.201891 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.201668 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/eed870e7-a8e3-452d-9200-9c7eed4f6f71-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:38:00.666074 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.666038 2565 generic.go:358] "Generic (PLEG): container finished" podID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerID="96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0" exitCode=0
Apr 17 17:38:00.666279 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.666112 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"
Apr 17 17:38:00.666279 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.666127 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" event={"ID":"eed870e7-a8e3-452d-9200-9c7eed4f6f71","Type":"ContainerDied","Data":"96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0"}
Apr 17 17:38:00.666279 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.666165 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn" event={"ID":"eed870e7-a8e3-452d-9200-9c7eed4f6f71","Type":"ContainerDied","Data":"ef04713b0b876caf8d46a5e141eaf99faeb87ec65b19816b28e4cfd1a2f61c25"}
Apr 17 17:38:00.666279 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.666182 2565 scope.go:117] "RemoveContainer" containerID="96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0"
Apr 17 17:38:00.676809 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.676786 2565 scope.go:117] "RemoveContainer" containerID="b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37"
Apr 17 17:38:00.684041 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.684023 2565 scope.go:117] "RemoveContainer" containerID="78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842"
Apr 17 17:38:00.692356 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.692325 2565 scope.go:117] "RemoveContainer" containerID="96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0"
Apr 17 17:38:00.692490 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.692469 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"]
Apr 17 17:38:00.692619 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:38:00.692600 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0\": container with ID starting with 96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0 not found: ID does not exist" containerID="96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0"
Apr 17 17:38:00.692686 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.692634 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0"} err="failed to get container status \"96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0\": rpc error: code = NotFound desc = could not find container \"96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0\": container with ID starting with 96a9caa484bc3caf072947840fcb60cbdb574dfc450f1c0a6b7e9946b1b266c0 not found: ID does not exist"
Apr 17 17:38:00.692686 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.692662 2565 scope.go:117] "RemoveContainer" containerID="b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37"
Apr 17 17:38:00.692931 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:38:00.692914 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37\": container with ID starting with b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37 not found: ID does not exist" containerID="b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37"
Apr 17 17:38:00.693003 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.692938 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37"} err="failed to get container status \"b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37\": rpc error: code = NotFound desc = could not find container \"b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37\": container with ID starting with b1e2ca14313335024f550b19d95f5cb647560b597d470237d2748131c27b9f37 not found: ID does not exist"
Apr 17 17:38:00.693003 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.692958 2565 scope.go:117] "RemoveContainer" containerID="78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842"
Apr 17 17:38:00.693861 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:38:00.693404 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842\": container with ID starting with 78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842 not found: ID does not exist" containerID="78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842"
Apr 17 17:38:00.693861 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.693454 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842"} err="failed to get container status \"78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842\": rpc error: code = NotFound desc = could not find container \"78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842\": container with ID starting with 78f01ad4972086aab0ae63a1794098f4ff951f5d55630507d3e4f32dca1ee842 not found: ID does not exist"
Apr 17 17:38:00.695951 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:00.695908 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-d58dc5bzdjrn"]
Apr 17 17:38:01.951715 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:01.951680 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" path="/var/lib/kubelet/pods/eed870e7-a8e3-452d-9200-9c7eed4f6f71/volumes"
Apr 17 17:38:06.926122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926086 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"]
Apr 17 17:38:06.926543 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926470 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="storage-initializer"
Apr 17 17:38:06.926543 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926494 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="storage-initializer"
Apr 17 17:38:06.926543 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926508 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="main"
Apr 17 17:38:06.926543 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926515 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="main"
Apr 17 17:38:06.926543 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926522 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="tokenizer"
Apr 17 17:38:06.926543 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926527 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="tokenizer"
Apr 17 17:38:06.926748 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926576 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="tokenizer"
Apr 17 17:38:06.926748 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.926584 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="eed870e7-a8e3-452d-9200-9c7eed4f6f71" containerName="main"
Apr 17 17:38:06.932188 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.932157 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:06.934476 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.934459 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 17 17:38:06.937994 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.937974 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"]
Apr 17 17:38:06.961573 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.961543 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-dshm\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:06.961788 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.961593 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-model-cache\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:06.961788 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.961663 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-home\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:06.961788 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.961704 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:06.961788 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.961778 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ckkrp\" (UniqueName: \"kubernetes.io/projected/da708f25-ddc9-404d-ae37-6476da026dff-kube-api-access-ckkrp\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:06.961932 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:06.961799 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da708f25-ddc9-404d-ae37-6476da026dff-tls-certs\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.062694 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.062659 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ckkrp\" (UniqueName: \"kubernetes.io/projected/da708f25-ddc9-404d-ae37-6476da026dff-kube-api-access-ckkrp\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.062694 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.062696 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da708f25-ddc9-404d-ae37-6476da026dff-tls-certs\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.062932 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.062733 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-dshm\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.062932 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.062758 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-model-cache\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.062932 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.062782 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-home\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.062932 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.062807 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.063182 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.063150 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.063340 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.063215 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-home\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.063340 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.063273 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-model-cache\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.065055 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.065029 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-dshm\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.065266 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.065224 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da708f25-ddc9-404d-ae37-6476da026dff-tls-certs\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.071115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.071097 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ckkrp\" (UniqueName: \"kubernetes.io/projected/da708f25-ddc9-404d-ae37-6476da026dff-kube-api-access-ckkrp\") pod \"precise-prefix-cache-test-kserve-849dfd55d5-wwp7m\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.152322 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.152288 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"]
Apr 17 17:38:07.156326 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.156303 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.160071 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.160048 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-kk26n\""
Apr 17 17:38:07.167697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.167673 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"]
Apr 17 17:38:07.244138 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.244054 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"
Apr 17 17:38:07.264973 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.264943 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.265098 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.264983 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.265098 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.265007 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.265202 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.265099 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.265202 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.265126 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.265202 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.265156 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nczfz\" (UniqueName: \"kubernetes.io/projected/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kube-api-access-nczfz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366349 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366313 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366495 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366359 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366495 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366395 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366495 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366438 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366495 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366460 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366495 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366481 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nczfz\" (UniqueName: \"kubernetes.io/projected/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kube-api-access-nczfz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366780 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366743 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366877 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366755 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366877 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366808 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.366877 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.366866 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.368811 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.368785 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.378811 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.378788 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nczfz\" (UniqueName: \"kubernetes.io/projected/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kube-api-access-nczfz\") pod \"precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"
Apr 17 17:38:07.382815 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.382790 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"]
Apr 17 17:38:07.384535 ip-10-0-133-87 kubenswrapper[2565]:
W0417 17:38:07.384512 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda708f25_ddc9_404d_ae37_6476da026dff.slice/crio-d91338461d0d8e0d7dd376b373db0100ef2286a29476030f7e757734ca719535 WatchSource:0}: Error finding container d91338461d0d8e0d7dd376b373db0100ef2286a29476030f7e757734ca719535: Status 404 returned error can't find the container with id d91338461d0d8e0d7dd376b373db0100ef2286a29476030f7e757734ca719535 Apr 17 17:38:07.467287 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.467225 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:07.691679 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.691647 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" event={"ID":"da708f25-ddc9-404d-ae37-6476da026dff","Type":"ContainerStarted","Data":"8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa"} Apr 17 17:38:07.691679 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.691682 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" event={"ID":"da708f25-ddc9-404d-ae37-6476da026dff","Type":"ContainerStarted","Data":"d91338461d0d8e0d7dd376b373db0100ef2286a29476030f7e757734ca719535"} Apr 17 17:38:07.811477 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:07.811448 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"] Apr 17 17:38:07.815375 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:38:07.815343 2565 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83f307fa_447a_4d45_997d_e5d2fb4cfd25.slice/crio-0c4dd6c6bac73e83bf2e44376b98647c06eee53f92b717e0b5f55f7522c354d6 WatchSource:0}: Error finding container 0c4dd6c6bac73e83bf2e44376b98647c06eee53f92b717e0b5f55f7522c354d6: Status 404 returned error can't find the container with id 0c4dd6c6bac73e83bf2e44376b98647c06eee53f92b717e0b5f55f7522c354d6 Apr 17 17:38:08.697182 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:08.697145 2565 generic.go:358] "Generic (PLEG): container finished" podID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerID="582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30" exitCode=0 Apr 17 17:38:08.697654 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:08.697223 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" event={"ID":"83f307fa-447a-4d45-997d-e5d2fb4cfd25","Type":"ContainerDied","Data":"582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30"} Apr 17 17:38:08.697654 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:08.697281 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" event={"ID":"83f307fa-447a-4d45-997d-e5d2fb4cfd25","Type":"ContainerStarted","Data":"0c4dd6c6bac73e83bf2e44376b98647c06eee53f92b717e0b5f55f7522c354d6"} Apr 17 17:38:09.702168 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:09.702136 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" event={"ID":"83f307fa-447a-4d45-997d-e5d2fb4cfd25","Type":"ContainerStarted","Data":"59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181"} Apr 17 17:38:09.702168 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:09.702171 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" event={"ID":"83f307fa-447a-4d45-997d-e5d2fb4cfd25","Type":"ContainerStarted","Data":"aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35"} Apr 17 17:38:09.702719 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:09.702278 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:09.725500 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:09.725441 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" podStartSLOduration=2.725422086 podStartE2EDuration="2.725422086s" podCreationTimestamp="2026-04-17 17:38:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:38:09.721909294 +0000 UTC m=+830.379953610" watchObservedRunningTime="2026-04-17 17:38:09.725422086 +0000 UTC m=+830.383466395" Apr 17 17:38:12.714011 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:12.713976 2565 generic.go:358] "Generic (PLEG): container finished" podID="da708f25-ddc9-404d-ae37-6476da026dff" containerID="8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa" exitCode=0 Apr 17 17:38:12.714415 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:12.714058 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" event={"ID":"da708f25-ddc9-404d-ae37-6476da026dff","Type":"ContainerDied","Data":"8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa"} Apr 17 17:38:14.727274 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:14.727224 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" 
event={"ID":"da708f25-ddc9-404d-ae37-6476da026dff","Type":"ContainerStarted","Data":"35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7"} Apr 17 17:38:14.748035 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:14.747976 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" podStartSLOduration=7.698922868 podStartE2EDuration="8.747962988s" podCreationTimestamp="2026-04-17 17:38:06 +0000 UTC" firstStartedPulling="2026-04-17 17:38:12.715195534 +0000 UTC m=+833.373239819" lastFinishedPulling="2026-04-17 17:38:13.764235636 +0000 UTC m=+834.422279939" observedRunningTime="2026-04-17 17:38:14.745538765 +0000 UTC m=+835.403583065" watchObservedRunningTime="2026-04-17 17:38:14.747962988 +0000 UTC m=+835.406007296" Apr 17 17:38:17.244712 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:17.244671 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" Apr 17 17:38:17.245202 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:17.244990 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" Apr 17 17:38:17.257442 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:17.257416 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" Apr 17 17:38:17.468421 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:17.468381 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:17.468421 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:17.468426 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:17.469700 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:38:17.469675 2565 logging.go:55] [core] [Channel #71 SubChannel #72]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.39:9003", ServerName: "10.134.0.39:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.39:9003: connect: connection refused" Apr 17 17:38:17.470992 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:17.470970 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:17.738199 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:17.738170 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:17.748691 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:17.748667 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" Apr 17 17:38:18.469335 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:18.469287 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.39:9003\" within 1s: context deadline exceeded" Apr 17 17:38:27.468184 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:38:27.468149 2565 logging.go:55] [core] [Channel #79 SubChannel #80]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.39:9003", ServerName: "10.134.0.39:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.39:9003: connect: connection refused" Apr 17 17:38:28.467912 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:28.467843 2565 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.39:9003\" within 1s: context deadline exceeded" Apr 17 17:38:38.742216 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:38.742187 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:39.725084 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:39.725053 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"] Apr 17 17:38:39.725459 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:39.725428 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="main" containerID="cri-o://aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35" gracePeriod=30 Apr 17 17:38:39.726023 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:39.725806 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="tokenizer" containerID="cri-o://59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181" gracePeriod=30 Apr 17 17:38:39.735271 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:39.734352 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"] Apr 17 17:38:39.735271 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:39.734680 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" podUID="da708f25-ddc9-404d-ae37-6476da026dff" containerName="main" containerID="cri-o://35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7" gracePeriod=30 Apr 17 17:38:39.987765 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:39.987743 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" Apr 17 17:38:40.065963 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.065935 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-kserve-provision-location\") pod \"da708f25-ddc9-404d-ae37-6476da026dff\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " Apr 17 17:38:40.066131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.065976 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-dshm\") pod \"da708f25-ddc9-404d-ae37-6476da026dff\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " Apr 17 17:38:40.066131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.065993 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-home\") pod \"da708f25-ddc9-404d-ae37-6476da026dff\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " Apr 17 17:38:40.066131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.066026 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" 
(UniqueName: \"kubernetes.io/secret/da708f25-ddc9-404d-ae37-6476da026dff-tls-certs\") pod \"da708f25-ddc9-404d-ae37-6476da026dff\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " Apr 17 17:38:40.066131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.066065 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-model-cache\") pod \"da708f25-ddc9-404d-ae37-6476da026dff\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " Apr 17 17:38:40.066131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.066086 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ckkrp\" (UniqueName: \"kubernetes.io/projected/da708f25-ddc9-404d-ae37-6476da026dff-kube-api-access-ckkrp\") pod \"da708f25-ddc9-404d-ae37-6476da026dff\" (UID: \"da708f25-ddc9-404d-ae37-6476da026dff\") " Apr 17 17:38:40.066415 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.066332 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-home" (OuterVolumeSpecName: "home") pod "da708f25-ddc9-404d-ae37-6476da026dff" (UID: "da708f25-ddc9-404d-ae37-6476da026dff"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:40.066415 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.066382 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-model-cache" (OuterVolumeSpecName: "model-cache") pod "da708f25-ddc9-404d-ae37-6476da026dff" (UID: "da708f25-ddc9-404d-ae37-6476da026dff"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:40.066785 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.066763 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-home\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.066893 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.066786 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-model-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.068405 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.068377 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da708f25-ddc9-404d-ae37-6476da026dff-kube-api-access-ckkrp" (OuterVolumeSpecName: "kube-api-access-ckkrp") pod "da708f25-ddc9-404d-ae37-6476da026dff" (UID: "da708f25-ddc9-404d-ae37-6476da026dff"). InnerVolumeSpecName "kube-api-access-ckkrp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:40.068554 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.068525 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-dshm" (OuterVolumeSpecName: "dshm") pod "da708f25-ddc9-404d-ae37-6476da026dff" (UID: "da708f25-ddc9-404d-ae37-6476da026dff"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:40.068554 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.068543 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da708f25-ddc9-404d-ae37-6476da026dff-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "da708f25-ddc9-404d-ae37-6476da026dff" (UID: "da708f25-ddc9-404d-ae37-6476da026dff"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:38:40.122480 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.122440 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da708f25-ddc9-404d-ae37-6476da026dff" (UID: "da708f25-ddc9-404d-ae37-6476da026dff"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:40.167291 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.167233 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ckkrp\" (UniqueName: \"kubernetes.io/projected/da708f25-ddc9-404d-ae37-6476da026dff-kube-api-access-ckkrp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.167291 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.167291 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.167540 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.167303 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/da708f25-ddc9-404d-ae37-6476da026dff-dshm\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.167540 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.167315 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da708f25-ddc9-404d-ae37-6476da026dff-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:40.816348 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.816311 2565 generic.go:358] "Generic (PLEG): container finished" podID="da708f25-ddc9-404d-ae37-6476da026dff" 
containerID="35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7" exitCode=0 Apr 17 17:38:40.816553 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.816393 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" Apr 17 17:38:40.816553 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.816400 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" event={"ID":"da708f25-ddc9-404d-ae37-6476da026dff","Type":"ContainerDied","Data":"35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7"} Apr 17 17:38:40.816553 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.816452 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m" event={"ID":"da708f25-ddc9-404d-ae37-6476da026dff","Type":"ContainerDied","Data":"d91338461d0d8e0d7dd376b373db0100ef2286a29476030f7e757734ca719535"} Apr 17 17:38:40.816553 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.816474 2565 scope.go:117] "RemoveContainer" containerID="35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7" Apr 17 17:38:40.818731 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.818706 2565 generic.go:358] "Generic (PLEG): container finished" podID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerID="aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35" exitCode=0 Apr 17 17:38:40.818862 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.818757 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" event={"ID":"83f307fa-447a-4d45-997d-e5d2fb4cfd25","Type":"ContainerDied","Data":"aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35"} Apr 17 17:38:40.825468 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.825452 2565 scope.go:117] 
"RemoveContainer" containerID="8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa" Apr 17 17:38:40.835201 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.835185 2565 scope.go:117] "RemoveContainer" containerID="35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7" Apr 17 17:38:40.835458 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:38:40.835438 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7\": container with ID starting with 35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7 not found: ID does not exist" containerID="35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7" Apr 17 17:38:40.835542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.835467 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7"} err="failed to get container status \"35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7\": rpc error: code = NotFound desc = could not find container \"35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7\": container with ID starting with 35326f8ad46910189286db358d71585dc6e5548fcafbede62cb2866cf9289ea7 not found: ID does not exist" Apr 17 17:38:40.835542 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.835485 2565 scope.go:117] "RemoveContainer" containerID="8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa" Apr 17 17:38:40.835710 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:38:40.835692 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa\": container with ID starting with 8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa not found: ID does not exist" 
containerID="8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa" Apr 17 17:38:40.835777 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.835716 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa"} err="failed to get container status \"8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa\": rpc error: code = NotFound desc = could not find container \"8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa\": container with ID starting with 8ca21cf887ebac5de3f9c1a7beebdb5fd9e814668e3bccf82e259750a066affa not found: ID does not exist" Apr 17 17:38:40.839112 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.839088 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"] Apr 17 17:38:40.846677 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:40.846652 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-849dfd55d5-wwp7m"] Apr 17 17:38:41.303024 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.303001 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:41.377952 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.377876 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kserve-provision-location\") pod \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " Apr 17 17:38:41.377952 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.377932 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-uds\") pod \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " Apr 17 17:38:41.378185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.377988 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-cache\") pod \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " Apr 17 17:38:41.378185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.378016 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tls-certs\") pod \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " Apr 17 17:38:41.378185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.378034 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nczfz\" (UniqueName: \"kubernetes.io/projected/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kube-api-access-nczfz\") pod \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " 
Apr 17 17:38:41.378185 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.378069 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-tmp\") pod \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\" (UID: \"83f307fa-447a-4d45-997d-e5d2fb4cfd25\") " Apr 17 17:38:41.378429 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.378288 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "83f307fa-447a-4d45-997d-e5d2fb4cfd25" (UID: "83f307fa-447a-4d45-997d-e5d2fb4cfd25"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:41.378429 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.378344 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "83f307fa-447a-4d45-997d-e5d2fb4cfd25" (UID: "83f307fa-447a-4d45-997d-e5d2fb4cfd25"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:41.378550 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.378522 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "83f307fa-447a-4d45-997d-e5d2fb4cfd25" (UID: "83f307fa-447a-4d45-997d-e5d2fb4cfd25"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:41.378746 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.378722 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "83f307fa-447a-4d45-997d-e5d2fb4cfd25" (UID: "83f307fa-447a-4d45-997d-e5d2fb4cfd25"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:38:41.380281 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.380185 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kube-api-access-nczfz" (OuterVolumeSpecName: "kube-api-access-nczfz") pod "83f307fa-447a-4d45-997d-e5d2fb4cfd25" (UID: "83f307fa-447a-4d45-997d-e5d2fb4cfd25"). InnerVolumeSpecName "kube-api-access-nczfz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:38:41.380621 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.380598 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "83f307fa-447a-4d45-997d-e5d2fb4cfd25" (UID: "83f307fa-447a-4d45-997d-e5d2fb4cfd25"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:38:41.478888 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.478848 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:41.478888 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.478882 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:41.478888 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.478893 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nczfz\" (UniqueName: \"kubernetes.io/projected/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kube-api-access-nczfz\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:41.479115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.478903 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:41.479115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.478913 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:41.479115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.478923 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/83f307fa-447a-4d45-997d-e5d2fb4cfd25-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:38:41.830181 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:38:41.830147 2565 generic.go:358] "Generic (PLEG): container finished" podID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerID="59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181" exitCode=0 Apr 17 17:38:41.830377 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.830198 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" event={"ID":"83f307fa-447a-4d45-997d-e5d2fb4cfd25","Type":"ContainerDied","Data":"59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181"} Apr 17 17:38:41.830377 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.830231 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" event={"ID":"83f307fa-447a-4d45-997d-e5d2fb4cfd25","Type":"ContainerDied","Data":"0c4dd6c6bac73e83bf2e44376b98647c06eee53f92b717e0b5f55f7522c354d6"} Apr 17 17:38:41.830377 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.830275 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj" Apr 17 17:38:41.830377 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.830271 2565 scope.go:117] "RemoveContainer" containerID="59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181" Apr 17 17:38:41.838936 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.838918 2565 scope.go:117] "RemoveContainer" containerID="aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35" Apr 17 17:38:41.848023 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.848003 2565 scope.go:117] "RemoveContainer" containerID="582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30" Apr 17 17:38:41.854425 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.854402 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"] Apr 17 17:38:41.855898 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.855879 2565 scope.go:117] "RemoveContainer" containerID="59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181" Apr 17 17:38:41.856163 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:38:41.856142 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181\": container with ID starting with 59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181 not found: ID does not exist" containerID="59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181" Apr 17 17:38:41.856233 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.856178 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181"} err="failed to get container status \"59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181\": rpc error: code = NotFound desc = 
could not find container \"59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181\": container with ID starting with 59e9ed0d0bff14572c50282f0b242bcdee488a1f174128d792f950ba92bc7181 not found: ID does not exist" Apr 17 17:38:41.856233 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.856205 2565 scope.go:117] "RemoveContainer" containerID="aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35" Apr 17 17:38:41.856515 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:38:41.856489 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35\": container with ID starting with aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35 not found: ID does not exist" containerID="aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35" Apr 17 17:38:41.856771 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.856518 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35"} err="failed to get container status \"aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35\": rpc error: code = NotFound desc = could not find container \"aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35\": container with ID starting with aeae8e44352335607423945ddb7b221aca6708232433eefdc91c8f8f757eeb35 not found: ID does not exist" Apr 17 17:38:41.856771 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.856534 2565 scope.go:117] "RemoveContainer" containerID="582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30" Apr 17 17:38:41.856920 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:38:41.856891 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30\": container 
with ID starting with 582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30 not found: ID does not exist" containerID="582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30" Apr 17 17:38:41.856985 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.856924 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30"} err="failed to get container status \"582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30\": rpc error: code = NotFound desc = could not find container \"582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30\": container with ID starting with 582cbceadb38e4a0a069817e43355dff5b104139af58e4bf214a8a0ac229cf30 not found: ID does not exist" Apr 17 17:38:41.859131 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.859109 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-5b5569d46q7xj"] Apr 17 17:38:41.951948 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.951912 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" path="/var/lib/kubelet/pods/83f307fa-447a-4d45-997d-e5d2fb4cfd25/volumes" Apr 17 17:38:41.952422 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:38:41.952407 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da708f25-ddc9-404d-ae37-6476da026dff" path="/var/lib/kubelet/pods/da708f25-ddc9-404d-ae37-6476da026dff/volumes" Apr 17 17:39:26.676954 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:26.676919 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59"] Apr 17 17:39:26.679178 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:26.677231 2565 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="main" containerID="cri-o://df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad" gracePeriod=30 Apr 17 17:39:26.679178 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:26.677313 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="tokenizer" containerID="cri-o://0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d" gracePeriod=30 Apr 17 17:39:26.983455 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:26.983363 2565 generic.go:358] "Generic (PLEG): container finished" podID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerID="df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad" exitCode=0 Apr 17 17:39:26.983455 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:26.983441 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" event={"ID":"2ce3e895-69ab-4fa8-a8f7-10a0284b04be","Type":"ContainerDied","Data":"df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad"} Apr 17 17:39:27.928440 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:27.928413 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:39:27.987969 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:27.987938 2565 generic.go:358] "Generic (PLEG): container finished" podID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerID="0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d" exitCode=0 Apr 17 17:39:27.988115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:27.988011 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" Apr 17 17:39:27.988115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:27.988017 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" event={"ID":"2ce3e895-69ab-4fa8-a8f7-10a0284b04be","Type":"ContainerDied","Data":"0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d"} Apr 17 17:39:27.988115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:27.988056 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59" event={"ID":"2ce3e895-69ab-4fa8-a8f7-10a0284b04be","Type":"ContainerDied","Data":"24caf0a3f04519884c877f5a4ad047677b3dabd71cc00ffb283913e2d1c63657"} Apr 17 17:39:27.988115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:27.988072 2565 scope.go:117] "RemoveContainer" containerID="0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d" Apr 17 17:39:27.995844 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:27.995824 2565 scope.go:117] "RemoveContainer" containerID="df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad" Apr 17 17:39:28.002744 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.002728 2565 scope.go:117] "RemoveContainer" containerID="122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7" Apr 17 17:39:28.009682 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.009667 2565 scope.go:117] "RemoveContainer" containerID="0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d" Apr 17 17:39:28.009908 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:39:28.009889 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d\": container with ID starting with 
0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d not found: ID does not exist" containerID="0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d" Apr 17 17:39:28.009956 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.009920 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d"} err="failed to get container status \"0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d\": rpc error: code = NotFound desc = could not find container \"0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d\": container with ID starting with 0a5ff179bb3cc0bf2b55be5ef676593f2d0e26659bcf81a432551e3085aaf37d not found: ID does not exist" Apr 17 17:39:28.009956 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.009938 2565 scope.go:117] "RemoveContainer" containerID="df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad" Apr 17 17:39:28.010144 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:39:28.010127 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad\": container with ID starting with df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad not found: ID does not exist" containerID="df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad" Apr 17 17:39:28.010204 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.010150 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad"} err="failed to get container status \"df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad\": rpc error: code = NotFound desc = could not find container \"df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad\": container with ID starting with 
df531296870cbcda54eff271248fce34cf1874d19ce678284e3b11fcd3cbb3ad not found: ID does not exist" Apr 17 17:39:28.010204 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.010166 2565 scope.go:117] "RemoveContainer" containerID="122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7" Apr 17 17:39:28.010409 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:39:28.010393 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7\": container with ID starting with 122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7 not found: ID does not exist" containerID="122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7" Apr 17 17:39:28.010456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.010413 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7"} err="failed to get container status \"122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7\": rpc error: code = NotFound desc = could not find container \"122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7\": container with ID starting with 122cfeaa9deb2bf0058cbb2a4347923c677a1121d3ad391c38d3d642cf9a62e7 not found: ID does not exist" Apr 17 17:39:28.070086 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070060 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kserve-provision-location\") pod \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " Apr 17 17:39:28.070219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070095 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-tmp\") pod \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " Apr 17 17:39:28.070219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070168 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9275d\" (UniqueName: \"kubernetes.io/projected/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kube-api-access-9275d\") pod \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " Apr 17 17:39:28.070219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070185 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tls-certs\") pod \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " Apr 17 17:39:28.070219 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070207 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-cache\") pod \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " Apr 17 17:39:28.070475 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070223 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-uds\") pod \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\" (UID: \"2ce3e895-69ab-4fa8-a8f7-10a0284b04be\") " Apr 17 17:39:28.070569 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070514 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2ce3e895-69ab-4fa8-a8f7-10a0284b04be" (UID: 
"2ce3e895-69ab-4fa8-a8f7-10a0284b04be"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:28.070805 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070543 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2ce3e895-69ab-4fa8-a8f7-10a0284b04be" (UID: "2ce3e895-69ab-4fa8-a8f7-10a0284b04be"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:28.070805 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070648 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2ce3e895-69ab-4fa8-a8f7-10a0284b04be" (UID: "2ce3e895-69ab-4fa8-a8f7-10a0284b04be"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:28.070949 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.070927 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2ce3e895-69ab-4fa8-a8f7-10a0284b04be" (UID: "2ce3e895-69ab-4fa8-a8f7-10a0284b04be"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:39:28.072297 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.072275 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kube-api-access-9275d" (OuterVolumeSpecName: "kube-api-access-9275d") pod "2ce3e895-69ab-4fa8-a8f7-10a0284b04be" (UID: "2ce3e895-69ab-4fa8-a8f7-10a0284b04be"). InnerVolumeSpecName "kube-api-access-9275d". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:39:28.072379 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.072323 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2ce3e895-69ab-4fa8-a8f7-10a0284b04be" (UID: "2ce3e895-69ab-4fa8-a8f7-10a0284b04be"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:39:28.171539 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.171503 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9275d\" (UniqueName: \"kubernetes.io/projected/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kube-api-access-9275d\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:39:28.171539 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.171532 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:39:28.171539 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.171546 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:39:28.171769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.171555 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:39:28.171769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.171565 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:39:28.171769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.171574 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2ce3e895-69ab-4fa8-a8f7-10a0284b04be-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:39:28.310479 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.310449 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59"] Apr 17 17:39:28.314012 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:28.313985 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schemph59"] Apr 17 17:39:29.952017 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:29.951983 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" path="/var/lib/kubelet/pods/2ce3e895-69ab-4fa8-a8f7-10a0284b04be/volumes" Apr 17 17:39:32.628157 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.628120 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl"] Apr 17 17:39:32.628994 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.628970 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="main" Apr 17 17:39:32.628994 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.628994 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="main" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629015 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="da708f25-ddc9-404d-ae37-6476da026dff" containerName="storage-initializer" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629024 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="da708f25-ddc9-404d-ae37-6476da026dff" containerName="storage-initializer" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629050 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="storage-initializer" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629060 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="storage-initializer" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629078 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="storage-initializer" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629092 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="storage-initializer" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629105 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="tokenizer" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629113 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="tokenizer" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629122 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="main" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629131 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="main" Apr 17 17:39:32.629150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629155 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="tokenizer" Apr 17 17:39:32.629649 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629163 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="tokenizer" Apr 17 17:39:32.629649 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629187 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da708f25-ddc9-404d-ae37-6476da026dff" containerName="main" Apr 17 17:39:32.629649 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629195 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="da708f25-ddc9-404d-ae37-6476da026dff" containerName="main" Apr 17 17:39:32.629649 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629642 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="tokenizer" Apr 17 17:39:32.629821 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629662 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="main" Apr 17 17:39:32.629821 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629671 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2ce3e895-69ab-4fa8-a8f7-10a0284b04be" containerName="tokenizer" Apr 17 17:39:32.629821 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629687 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="83f307fa-447a-4d45-997d-e5d2fb4cfd25" containerName="main" Apr 17 17:39:32.629821 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.629704 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="da708f25-ddc9-404d-ae37-6476da026dff" containerName="main" Apr 17 
17:39:32.637414 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.637391 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.642421 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.642400 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-bs7rs\"" Apr 17 17:39:32.643198 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.643178 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 17 17:39:32.643478 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.643216 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:39:32.646674 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.646651 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl"] Apr 17 17:39:32.811549 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.811516 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80392d80-3d8b-4f2a-a091-79922e82709b-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.811724 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.811570 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-kserve-provision-location\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.811724 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.811638 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.811724 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.811707 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v9dd2\" (UniqueName: \"kubernetes.io/projected/80392d80-3d8b-4f2a-a091-79922e82709b-kube-api-access-v9dd2\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.811824 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.811758 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.811824 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.811780 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.912561 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.912518 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80392d80-3d8b-4f2a-a091-79922e82709b-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.912723 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.912587 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.912723 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.912625 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.912723 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.912656 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v9dd2\" (UniqueName: 
\"kubernetes.io/projected/80392d80-3d8b-4f2a-a091-79922e82709b-kube-api-access-v9dd2\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.912723 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.912708 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.912886 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.912737 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.913112 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.913089 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.913220 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.913125 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.913220 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.913158 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.913220 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.913191 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.914992 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.914974 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80392d80-3d8b-4f2a-a091-79922e82709b-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.920865 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.920846 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v9dd2\" (UniqueName: \"kubernetes.io/projected/80392d80-3d8b-4f2a-a091-79922e82709b-kube-api-access-v9dd2\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:32.947994 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:32.947967 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:33.074386 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:33.074360 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl"] Apr 17 17:39:33.076475 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:39:33.076445 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod80392d80_3d8b_4f2a_a091_79922e82709b.slice/crio-03475e3a8a0f7f804aba414cd2f56e13bcc8e983e77bba63d7c681f8a3098b49 WatchSource:0}: Error finding container 03475e3a8a0f7f804aba414cd2f56e13bcc8e983e77bba63d7c681f8a3098b49: Status 404 returned error can't find the container with id 03475e3a8a0f7f804aba414cd2f56e13bcc8e983e77bba63d7c681f8a3098b49 Apr 17 17:39:33.078235 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:33.078219 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:39:34.010578 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:34.010543 2565 generic.go:358] "Generic (PLEG): container finished" podID="80392d80-3d8b-4f2a-a091-79922e82709b" containerID="e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f" exitCode=0 Apr 17 17:39:34.011034 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:34.010631 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" 
event={"ID":"80392d80-3d8b-4f2a-a091-79922e82709b","Type":"ContainerDied","Data":"e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f"} Apr 17 17:39:34.011034 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:34.010674 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" event={"ID":"80392d80-3d8b-4f2a-a091-79922e82709b","Type":"ContainerStarted","Data":"03475e3a8a0f7f804aba414cd2f56e13bcc8e983e77bba63d7c681f8a3098b49"} Apr 17 17:39:35.016461 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:35.016420 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" event={"ID":"80392d80-3d8b-4f2a-a091-79922e82709b","Type":"ContainerStarted","Data":"69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1"} Apr 17 17:39:35.016461 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:35.016464 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" event={"ID":"80392d80-3d8b-4f2a-a091-79922e82709b","Type":"ContainerStarted","Data":"d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800"} Apr 17 17:39:35.017006 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:35.016491 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:35.051703 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:35.051650 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" podStartSLOduration=3.05163415 podStartE2EDuration="3.05163415s" podCreationTimestamp="2026-04-17 17:39:32 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 17:39:35.051024405 +0000 UTC m=+915.709068715" watchObservedRunningTime="2026-04-17 17:39:35.05163415 +0000 UTC m=+915.709678460" Apr 17 17:39:42.948307 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:42.948261 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:42.948307 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:42.948304 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:42.951133 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:42.951107 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:39:43.044409 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:39:43.044378 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:40:04.048190 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:40:04.048158 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:41:28.865488 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:28.865453 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl"] Apr 17 17:41:28.866285 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:28.865977 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="main" 
containerID="cri-o://d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800" gracePeriod=30 Apr 17 17:41:28.866586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:28.865978 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="tokenizer" containerID="cri-o://69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1" gracePeriod=30 Apr 17 17:41:29.387873 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:29.387838 2565 generic.go:358] "Generic (PLEG): container finished" podID="80392d80-3d8b-4f2a-a091-79922e82709b" containerID="d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800" exitCode=0 Apr 17 17:41:29.388052 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:29.387891 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" event={"ID":"80392d80-3d8b-4f2a-a091-79922e82709b","Type":"ContainerDied","Data":"d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800"} Apr 17 17:41:30.128001 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.127978 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:41:30.226035 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.225952 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-uds\") pod \"80392d80-3d8b-4f2a-a091-79922e82709b\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " Apr 17 17:41:30.226035 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226000 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v9dd2\" (UniqueName: \"kubernetes.io/projected/80392d80-3d8b-4f2a-a091-79922e82709b-kube-api-access-v9dd2\") pod \"80392d80-3d8b-4f2a-a091-79922e82709b\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " Apr 17 17:41:30.226035 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226032 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-cache\") pod \"80392d80-3d8b-4f2a-a091-79922e82709b\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " Apr 17 17:41:30.226347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226064 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-tmp\") pod \"80392d80-3d8b-4f2a-a091-79922e82709b\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " Apr 17 17:41:30.226347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226087 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-kserve-provision-location\") pod \"80392d80-3d8b-4f2a-a091-79922e82709b\" (UID: 
\"80392d80-3d8b-4f2a-a091-79922e82709b\") " Apr 17 17:41:30.226347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226214 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80392d80-3d8b-4f2a-a091-79922e82709b-tls-certs\") pod \"80392d80-3d8b-4f2a-a091-79922e82709b\" (UID: \"80392d80-3d8b-4f2a-a091-79922e82709b\") " Apr 17 17:41:30.226347 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226297 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "80392d80-3d8b-4f2a-a091-79922e82709b" (UID: "80392d80-3d8b-4f2a-a091-79922e82709b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:30.226554 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226350 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "80392d80-3d8b-4f2a-a091-79922e82709b" (UID: "80392d80-3d8b-4f2a-a091-79922e82709b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:30.226554 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226476 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "80392d80-3d8b-4f2a-a091-79922e82709b" (UID: "80392d80-3d8b-4f2a-a091-79922e82709b"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:30.226554 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226532 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:41:30.226708 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226560 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:41:30.226881 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.226849 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "80392d80-3d8b-4f2a-a091-79922e82709b" (UID: "80392d80-3d8b-4f2a-a091-79922e82709b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:41:30.228313 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.228224 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/80392d80-3d8b-4f2a-a091-79922e82709b-kube-api-access-v9dd2" (OuterVolumeSpecName: "kube-api-access-v9dd2") pod "80392d80-3d8b-4f2a-a091-79922e82709b" (UID: "80392d80-3d8b-4f2a-a091-79922e82709b"). InnerVolumeSpecName "kube-api-access-v9dd2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:41:30.228313 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.228305 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/80392d80-3d8b-4f2a-a091-79922e82709b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "80392d80-3d8b-4f2a-a091-79922e82709b" (UID: "80392d80-3d8b-4f2a-a091-79922e82709b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:41:30.326945 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.326893 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:41:30.326945 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.326941 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/80392d80-3d8b-4f2a-a091-79922e82709b-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:41:30.326945 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.326954 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/80392d80-3d8b-4f2a-a091-79922e82709b-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:41:30.327188 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.326973 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v9dd2\" (UniqueName: \"kubernetes.io/projected/80392d80-3d8b-4f2a-a091-79922e82709b-kube-api-access-v9dd2\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:41:30.393134 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.393100 2565 generic.go:358] "Generic (PLEG): container finished" podID="80392d80-3d8b-4f2a-a091-79922e82709b" 
containerID="69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1" exitCode=0 Apr 17 17:41:30.393325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.393172 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" event={"ID":"80392d80-3d8b-4f2a-a091-79922e82709b","Type":"ContainerDied","Data":"69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1"} Apr 17 17:41:30.393325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.393179 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" Apr 17 17:41:30.393325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.393198 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl" event={"ID":"80392d80-3d8b-4f2a-a091-79922e82709b","Type":"ContainerDied","Data":"03475e3a8a0f7f804aba414cd2f56e13bcc8e983e77bba63d7c681f8a3098b49"} Apr 17 17:41:30.393325 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.393213 2565 scope.go:117] "RemoveContainer" containerID="69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1" Apr 17 17:41:30.401925 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.401908 2565 scope.go:117] "RemoveContainer" containerID="d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800" Apr 17 17:41:30.409039 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.409020 2565 scope.go:117] "RemoveContainer" containerID="e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f" Apr 17 17:41:30.416636 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.416610 2565 scope.go:117] "RemoveContainer" containerID="69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1" Apr 17 17:41:30.416937 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:41:30.416911 2565 log.go:32] "ContainerStatus from 
runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1\": container with ID starting with 69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1 not found: ID does not exist" containerID="69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1" Apr 17 17:41:30.417033 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.416949 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1"} err="failed to get container status \"69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1\": rpc error: code = NotFound desc = could not find container \"69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1\": container with ID starting with 69bfab4e66220836fe7b7639ad73fd69d2202491feb9daeee27177c510fb66b1 not found: ID does not exist" Apr 17 17:41:30.417033 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.416978 2565 scope.go:117] "RemoveContainer" containerID="d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800" Apr 17 17:41:30.417150 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.417127 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl"] Apr 17 17:41:30.417267 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:41:30.417229 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800\": container with ID starting with d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800 not found: ID does not exist" containerID="d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800" Apr 17 17:41:30.417319 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.417273 2565 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800"} err="failed to get container status \"d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800\": rpc error: code = NotFound desc = could not find container \"d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800\": container with ID starting with d9b4baf436f5d898d787f1ae12e5fafc8572ff1ee8fb6fdeddd332b2b008e800 not found: ID does not exist" Apr 17 17:41:30.417319 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.417292 2565 scope.go:117] "RemoveContainer" containerID="e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f" Apr 17 17:41:30.417538 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:41:30.417518 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f\": container with ID starting with e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f not found: ID does not exist" containerID="e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f" Apr 17 17:41:30.417574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.417545 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f"} err="failed to get container status \"e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f\": rpc error: code = NotFound desc = could not find container \"e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f\": container with ID starting with e4aae839de86bcbe6925f5a20487b24236bc344caf6040fb225725a5afe9744f not found: ID does not exist" Apr 17 17:41:30.424044 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:30.424019 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6fd4d8bddhwhl"]
Apr 17 17:41:31.955697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:31.955660 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" path="/var/lib/kubelet/pods/80392d80-3d8b-4f2a-a091-79922e82709b/volumes"
Apr 17 17:41:47.994906 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.994870 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"]
Apr 17 17:41:47.995309 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.995211 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="main"
Apr 17 17:41:47.995309 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.995222 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="main"
Apr 17 17:41:47.995309 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.995234 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="storage-initializer"
Apr 17 17:41:47.995309 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.995255 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="storage-initializer"
Apr 17 17:41:47.995309 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.995267 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="tokenizer"
Apr 17 17:41:47.995309 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.995276 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="tokenizer"
Apr 17 17:41:47.995499 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.995339 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="tokenizer"
Apr 17 17:41:47.995499 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.995348 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="80392d80-3d8b-4f2a-a091-79922e82709b" containerName="main"
Apr 17 17:41:47.998457 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:47.998437 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.001096 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.001075 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\""
Apr 17 17:41:48.001843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.001828 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\""
Apr 17 17:41:48.001933 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.001912 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-dxd7c\""
Apr 17 17:41:48.008870 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.008847 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"]
Apr 17 17:41:48.072531 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.072486 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.072531 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.072534 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.072777 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.072612 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.072777 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.072682 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.072777 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.072733 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.072777 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.072769 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r7vq\" (UniqueName: \"kubernetes.io/projected/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kube-api-access-7r7vq\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173194 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173165 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173378 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173210 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r7vq\" (UniqueName: \"kubernetes.io/projected/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kube-api-access-7r7vq\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173378 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173276 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173498 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173420 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173498 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173481 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173605 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173514 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173605 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173575 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173710 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173618 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173798 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173778 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.173910 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.173892 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.175738 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.175720 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.181553 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.181518 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r7vq\" (UniqueName: \"kubernetes.io/projected/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kube-api-access-7r7vq\") pod \"router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.309159 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.309064 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:48.654800 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:48.654771 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"]
Apr 17 17:41:48.657904 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:41:48.657870 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod005e025d_c7c5_47ea_8d3d_97e02f50e84b.slice/crio-d1c92ec43887aac4ae95b4a9a77d7e91a23bd802cdb623805ce29c3679b9b63b WatchSource:0}: Error finding container d1c92ec43887aac4ae95b4a9a77d7e91a23bd802cdb623805ce29c3679b9b63b: Status 404 returned error can't find the container with id d1c92ec43887aac4ae95b4a9a77d7e91a23bd802cdb623805ce29c3679b9b63b
Apr 17 17:41:49.455651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:49.455608 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" event={"ID":"005e025d-c7c5-47ea-8d3d-97e02f50e84b","Type":"ContainerStarted","Data":"5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce"}
Apr 17 17:41:49.455651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:49.455653 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" event={"ID":"005e025d-c7c5-47ea-8d3d-97e02f50e84b","Type":"ContainerStarted","Data":"d1c92ec43887aac4ae95b4a9a77d7e91a23bd802cdb623805ce29c3679b9b63b"}
Apr 17 17:41:50.460122 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:50.460084 2565 generic.go:358] "Generic (PLEG): container finished" podID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerID="5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce" exitCode=0
Apr 17 17:41:50.460525 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:50.460160 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" event={"ID":"005e025d-c7c5-47ea-8d3d-97e02f50e84b","Type":"ContainerDied","Data":"5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce"}
Apr 17 17:41:51.465471 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:51.465438 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" event={"ID":"005e025d-c7c5-47ea-8d3d-97e02f50e84b","Type":"ContainerStarted","Data":"cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0"}
Apr 17 17:41:51.465471 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:51.465473 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" event={"ID":"005e025d-c7c5-47ea-8d3d-97e02f50e84b","Type":"ContainerStarted","Data":"dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef"}
Apr 17 17:41:51.465876 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:51.465622 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:51.498835 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:51.498784 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" podStartSLOduration=4.49876772 podStartE2EDuration="4.49876772s" podCreationTimestamp="2026-04-17 17:41:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:41:51.497456627 +0000 UTC m=+1052.155500934" watchObservedRunningTime="2026-04-17 17:41:51.49876772 +0000 UTC m=+1052.156812027"
Apr 17 17:41:58.309446 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:58.309404 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:58.309869 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:58.309457 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:58.312285 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:58.312229 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:41:58.493593 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:41:58.493562 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:42:29.498039 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:42:29.498007 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:43:24.823408 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:24.823367 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv"]
Apr 17 17:43:24.823921 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:24.823662 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" podUID="c81ffb9f-ffd0-4087-822a-614a206ea1fb" containerName="manager" containerID="cri-o://62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9" gracePeriod=30
Apr 17 17:43:25.067687 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.067661 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv"
Apr 17 17:43:25.198447 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.198413 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xb5ss\" (UniqueName: \"kubernetes.io/projected/c81ffb9f-ffd0-4087-822a-614a206ea1fb-kube-api-access-xb5ss\") pod \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\" (UID: \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\") "
Apr 17 17:43:25.198621 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.198476 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c81ffb9f-ffd0-4087-822a-614a206ea1fb-cert\") pod \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\" (UID: \"c81ffb9f-ffd0-4087-822a-614a206ea1fb\") "
Apr 17 17:43:25.200537 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.200506 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c81ffb9f-ffd0-4087-822a-614a206ea1fb-cert" (OuterVolumeSpecName: "cert") pod "c81ffb9f-ffd0-4087-822a-614a206ea1fb" (UID: "c81ffb9f-ffd0-4087-822a-614a206ea1fb"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:43:25.200656 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.200597 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c81ffb9f-ffd0-4087-822a-614a206ea1fb-kube-api-access-xb5ss" (OuterVolumeSpecName: "kube-api-access-xb5ss") pod "c81ffb9f-ffd0-4087-822a-614a206ea1fb" (UID: "c81ffb9f-ffd0-4087-822a-614a206ea1fb"). InnerVolumeSpecName "kube-api-access-xb5ss". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:43:25.299238 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.299199 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xb5ss\" (UniqueName: \"kubernetes.io/projected/c81ffb9f-ffd0-4087-822a-614a206ea1fb-kube-api-access-xb5ss\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:43:25.299238 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.299230 2565 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c81ffb9f-ffd0-4087-822a-614a206ea1fb-cert\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:43:25.784564 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.784531 2565 generic.go:358] "Generic (PLEG): container finished" podID="c81ffb9f-ffd0-4087-822a-614a206ea1fb" containerID="62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9" exitCode=0
Apr 17 17:43:25.784731 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.784604 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" event={"ID":"c81ffb9f-ffd0-4087-822a-614a206ea1fb","Type":"ContainerDied","Data":"62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9"}
Apr 17 17:43:25.784731 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.784630 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv" event={"ID":"c81ffb9f-ffd0-4087-822a-614a206ea1fb","Type":"ContainerDied","Data":"c11a58f4ee77ce1e642acffe018f6224ed2d88d09a07a109719daebc47314f2b"}
Apr 17 17:43:25.784731 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.784644 2565 scope.go:117] "RemoveContainer" containerID="62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9"
Apr 17 17:43:25.789270 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.785026 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv"
Apr 17 17:43:25.799129 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.799110 2565 scope.go:117] "RemoveContainer" containerID="62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9"
Apr 17 17:43:25.799393 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:43:25.799375 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9\": container with ID starting with 62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9 not found: ID does not exist" containerID="62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9"
Apr 17 17:43:25.799437 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.799403 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9"} err="failed to get container status \"62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9\": rpc error: code = NotFound desc = could not find container \"62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9\": container with ID starting with 62712defd89d3fc6e2cbf4384d467f94684172083283c610ecc7944de18287b9 not found: ID does not exist"
Apr 17 17:43:25.808129 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.808104 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv"]
Apr 17 17:43:25.812489 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.812466 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-9d4bc98dc-8kmrv"]
Apr 17 17:43:25.952192 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:25.952154 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c81ffb9f-ffd0-4087-822a-614a206ea1fb" path="/var/lib/kubelet/pods/c81ffb9f-ffd0-4087-822a-614a206ea1fb/volumes"
Apr 17 17:43:49.340391 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:49.340236 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"]
Apr 17 17:43:49.340939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:49.340601 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="main" containerID="cri-o://dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef" gracePeriod=30
Apr 17 17:43:49.340939 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:49.340681 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="tokenizer" containerID="cri-o://cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0" gracePeriod=30
Apr 17 17:43:49.496267 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:43:49.496224 2565 logging.go:55] [core] [Channel #243 SubChannel #244]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.41:9003", ServerName: "10.134.0.41:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.41:9003: connect: connection refused"
Apr 17 17:43:49.866653 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:49.866617 2565 generic.go:358] "Generic (PLEG): container finished" podID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerID="dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef" exitCode=0
Apr 17 17:43:49.866821 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:49.866695 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" event={"ID":"005e025d-c7c5-47ea-8d3d-97e02f50e84b","Type":"ContainerDied","Data":"dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef"}
Apr 17 17:43:50.496416 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.496372 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.41:9003\" within 1s: context deadline exceeded"
Apr 17 17:43:50.597922 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.597900 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:43:50.721890 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.721857 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kserve-provision-location\") pod \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") "
Apr 17 17:43:50.721890 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.721895 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r7vq\" (UniqueName: \"kubernetes.io/projected/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kube-api-access-7r7vq\") pod \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") "
Apr 17 17:43:50.722101 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.721998 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-uds\") pod \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") "
Apr 17 17:43:50.722101 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.722021 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-tmp\") pod \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") "
Apr 17 17:43:50.722101 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.722057 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tls-certs\") pod \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") "
Apr 17 17:43:50.722233 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.722099 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-cache\") pod \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\" (UID: \"005e025d-c7c5-47ea-8d3d-97e02f50e84b\") "
Apr 17 17:43:50.722372 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.722342 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "005e025d-c7c5-47ea-8d3d-97e02f50e84b" (UID: "005e025d-c7c5-47ea-8d3d-97e02f50e84b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:43:50.722455 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.722435 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "005e025d-c7c5-47ea-8d3d-97e02f50e84b" (UID: "005e025d-c7c5-47ea-8d3d-97e02f50e84b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:43:50.722576 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.722559 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "005e025d-c7c5-47ea-8d3d-97e02f50e84b" (UID: "005e025d-c7c5-47ea-8d3d-97e02f50e84b"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:43:50.722879 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.722852 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "005e025d-c7c5-47ea-8d3d-97e02f50e84b" (UID: "005e025d-c7c5-47ea-8d3d-97e02f50e84b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:43:50.724311 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.724213 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kube-api-access-7r7vq" (OuterVolumeSpecName: "kube-api-access-7r7vq") pod "005e025d-c7c5-47ea-8d3d-97e02f50e84b" (UID: "005e025d-c7c5-47ea-8d3d-97e02f50e84b"). InnerVolumeSpecName "kube-api-access-7r7vq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:43:50.724311 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.724231 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "005e025d-c7c5-47ea-8d3d-97e02f50e84b" (UID: "005e025d-c7c5-47ea-8d3d-97e02f50e84b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:43:50.823148 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.823109 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:43:50.823148 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.823142 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:43:50.823148 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.823152 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:43:50.823148 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.823160 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:43:50.823445 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.823170 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:43:50.823445 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.823180 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7r7vq\" (UniqueName: \"kubernetes.io/projected/005e025d-c7c5-47ea-8d3d-97e02f50e84b-kube-api-access-7r7vq\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:43:50.871554 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.871518 2565 generic.go:358] "Generic (PLEG): container finished" podID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerID="cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0" exitCode=0
Apr 17 17:43:50.871693 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.871572 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" event={"ID":"005e025d-c7c5-47ea-8d3d-97e02f50e84b","Type":"ContainerDied","Data":"cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0"}
Apr 17 17:43:50.871693 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.871614 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4" event={"ID":"005e025d-c7c5-47ea-8d3d-97e02f50e84b","Type":"ContainerDied","Data":"d1c92ec43887aac4ae95b4a9a77d7e91a23bd802cdb623805ce29c3679b9b63b"}
Apr 17 17:43:50.871693 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.871621 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"
Apr 17 17:43:50.871693 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.871630 2565 scope.go:117] "RemoveContainer" containerID="cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0"
Apr 17 17:43:50.880494 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.880479 2565 scope.go:117] "RemoveContainer" containerID="dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef"
Apr 17 17:43:50.887441 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.887427 2565 scope.go:117] "RemoveContainer" containerID="5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce"
Apr 17 17:43:50.893404 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.893381 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"]
Apr 17 17:43:50.894964 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.894951 2565 scope.go:117] "RemoveContainer" containerID="cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0"
Apr 17 17:43:50.895215 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:43:50.895197 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0\": container with ID starting with cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0 not found: ID does not exist" containerID="cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0"
Apr 17 17:43:50.895293 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.895222 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0"} err="failed to get container status \"cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0\": rpc error: code = NotFound desc = could not find container \"cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0\": container with ID starting with cebb7eabcc03c766e531e736309aff475625f8eb85bb7ac445e2a93fcf6fb4c0 not found: ID does not exist"
Apr 17 17:43:50.895293 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.895252 2565 scope.go:117] "RemoveContainer" containerID="dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef"
Apr 17 17:43:50.895481 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:43:50.895463 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef\": container with ID starting with dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef not found: ID does not exist" containerID="dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef"
Apr 17 17:43:50.895523 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.895489 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef"} err="failed to get container status \"dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef\": rpc error: code = NotFound desc = could not find container \"dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef\": container with ID starting with dcca7cf244fc7802f4ccfa1b2c0eb505c476161c662bd1f9d4d06206f22b24ef not found: ID does not exist"
Apr 17 17:43:50.895523 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.895507 2565 scope.go:117] "RemoveContainer" containerID="5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce"
Apr 17 17:43:50.895776 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:43:50.895732 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce\": container with ID starting with 5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce not found: ID does not exist" containerID="5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce"
Apr 17 17:43:50.895776 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.895751 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce"} err="failed to get container status \"5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce\": rpc error: code = NotFound desc = could not find container \"5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce\": container with ID starting with 5a4251a906ffd5eb793ffffa92339869a48594cb8b98d4e64f4be5d84c17c8ce not found: ID does not exist"
Apr 17 17:43:50.898561 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:50.898539 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-86597cdffd-wdxt4"]
Apr 17 17:43:51.951702 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:43:51.951669 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" path="/var/lib/kubelet/pods/005e025d-c7c5-47ea-8d3d-97e02f50e84b/volumes"
Apr 17 17:44:09.233731 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.233684 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89"]
Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234031 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c81ffb9f-ffd0-4087-822a-614a206ea1fb" containerName="manager"
Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234042 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="c81ffb9f-ffd0-4087-822a-614a206ea1fb" containerName="manager"
Apr 17 17:44:09.234154 ip-10-0-133-87
kubenswrapper[2565]: I0417 17:44:09.234057 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="storage-initializer" Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234063 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="storage-initializer" Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234075 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="tokenizer" Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234081 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="tokenizer" Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234093 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="main" Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234099 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="main" Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234149 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="c81ffb9f-ffd0-4087-822a-614a206ea1fb" containerName="manager" Apr 17 17:44:09.234154 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234158 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="main" Apr 17 17:44:09.234514 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.234165 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="005e025d-c7c5-47ea-8d3d-97e02f50e84b" containerName="tokenizer" Apr 17 17:44:09.238645 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.238625 
2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.241317 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.241293 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-g4rjk\"" Apr 17 17:44:09.241876 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.241860 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-rf529\"" Apr 17 17:44:09.241970 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.241880 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 17 17:44:09.248697 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.248674 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89"] Apr 17 17:44:09.383678 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.383637 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6qhlv\" (UniqueName: \"kubernetes.io/projected/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kube-api-access-6qhlv\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.383678 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.383682 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: 
\"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.383935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.383708 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.383935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.383817 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.383935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.383927 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.384056 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.383962 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kserve-provision-location\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.485323 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.485197 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.485323 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.485282 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6qhlv\" (UniqueName: \"kubernetes.io/projected/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kube-api-access-6qhlv\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.485572 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.485325 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.485572 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.485358 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-cache\") pod 
\"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.485572 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.485435 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.485572 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.485532 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.486039 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.486015 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.486209 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.486069 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: 
\"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.486209 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.486149 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.486409 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.486354 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.488358 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.488337 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.494822 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.494802 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6qhlv\" (UniqueName: \"kubernetes.io/projected/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kube-api-access-6qhlv\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.548971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.548927 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:09.674111 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.674078 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89"] Apr 17 17:44:09.677788 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:44:09.677760 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda546b81e_640c_499b_9dd1_d0c7af89fbb9.slice/crio-ded95ff964c2da251cf9c3745d232e8b8d7ebd007dc5716958e22b67bb0b18f3 WatchSource:0}: Error finding container ded95ff964c2da251cf9c3745d232e8b8d7ebd007dc5716958e22b67bb0b18f3: Status 404 returned error can't find the container with id ded95ff964c2da251cf9c3745d232e8b8d7ebd007dc5716958e22b67bb0b18f3 Apr 17 17:44:09.934591 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.934553 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" event={"ID":"a546b81e-640c-499b-9dd1-d0c7af89fbb9","Type":"ContainerStarted","Data":"7cd8aafa554b1e24cbbe312149878d7a3ee5d985f8e10b3f691c4af90d520154"} Apr 17 17:44:09.934591 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:09.934594 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" event={"ID":"a546b81e-640c-499b-9dd1-d0c7af89fbb9","Type":"ContainerStarted","Data":"ded95ff964c2da251cf9c3745d232e8b8d7ebd007dc5716958e22b67bb0b18f3"} Apr 17 17:44:10.938419 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:10.938382 2565 generic.go:358] "Generic (PLEG): 
container finished" podID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerID="7cd8aafa554b1e24cbbe312149878d7a3ee5d985f8e10b3f691c4af90d520154" exitCode=0 Apr 17 17:44:10.938836 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:10.938434 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" event={"ID":"a546b81e-640c-499b-9dd1-d0c7af89fbb9","Type":"ContainerDied","Data":"7cd8aafa554b1e24cbbe312149878d7a3ee5d985f8e10b3f691c4af90d520154"} Apr 17 17:44:11.944470 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:11.944432 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" event={"ID":"a546b81e-640c-499b-9dd1-d0c7af89fbb9","Type":"ContainerStarted","Data":"7a4bd3c2c9880695a7b864acd3432f106b744b6201917b6b8d02795f008df4e5"} Apr 17 17:44:11.944998 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:11.944973 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" event={"ID":"a546b81e-640c-499b-9dd1-d0c7af89fbb9","Type":"ContainerStarted","Data":"5ba38a0af32e803336f6507bfba0845f0baac3f878cc825082893ecbbf60876b"} Apr 17 17:44:11.945128 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:11.945115 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:11.967666 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:11.967594 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" podStartSLOduration=2.967574001 podStartE2EDuration="2.967574001s" podCreationTimestamp="2026-04-17 17:44:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-17 17:44:11.966674126 +0000 UTC m=+1192.624718436" watchObservedRunningTime="2026-04-17 17:44:11.967574001 +0000 UTC m=+1192.625618307" Apr 17 17:44:19.549631 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:19.549595 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:19.549631 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:19.549634 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:19.552330 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:19.552304 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:19.973049 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:19.973026 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:44:50.977287 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:44:50.977233 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:47:19.373183 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.373146 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 17:47:19.377107 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.377083 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.380360 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.380337 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-ck8l2\"" Apr 17 17:47:19.380486 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.380380 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 17 17:47:19.385419 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.385393 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 17:47:19.424437 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.424398 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.424616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.424454 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6s8wk\" (UniqueName: \"kubernetes.io/projected/b1d27800-1b3a-42a3-a1ec-0756fb290961-kube-api-access-6s8wk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.424616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.424478 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d27800-1b3a-42a3-a1ec-0756fb290961-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.424616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.424531 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.424616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.424549 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.424616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.424573 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525209 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525171 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-dshm\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525406 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525213 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525406 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525284 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525406 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525383 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525559 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525444 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6s8wk\" (UniqueName: \"kubernetes.io/projected/b1d27800-1b3a-42a3-a1ec-0756fb290961-kube-api-access-6s8wk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525559 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525469 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d27800-1b3a-42a3-a1ec-0756fb290961-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525661 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525639 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525755 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525734 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.525819 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.525796 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.527643 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.527613 2565 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.527820 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.527803 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d27800-1b3a-42a3-a1ec-0756fb290961-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.535155 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.535133 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6s8wk\" (UniqueName: \"kubernetes.io/projected/b1d27800-1b3a-42a3-a1ec-0756fb290961-kube-api-access-6s8wk\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.688804 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.688774 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:47:19.813449 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.813421 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 17:47:19.814999 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:47:19.814966 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1d27800_1b3a_42a3_a1ec_0756fb290961.slice/crio-d9d6f5135640982bddbd09b139d49d215e0e29bac771fc57445332c2c055a531 WatchSource:0}: Error finding container d9d6f5135640982bddbd09b139d49d215e0e29bac771fc57445332c2c055a531: Status 404 returned error can't find the container with id d9d6f5135640982bddbd09b139d49d215e0e29bac771fc57445332c2c055a531 Apr 17 17:47:19.817096 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:19.817080 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:47:20.571304 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:20.571262 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b1d27800-1b3a-42a3-a1ec-0756fb290961","Type":"ContainerStarted","Data":"5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a"} Apr 17 17:47:20.571304 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:20.571308 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b1d27800-1b3a-42a3-a1ec-0756fb290961","Type":"ContainerStarted","Data":"d9d6f5135640982bddbd09b139d49d215e0e29bac771fc57445332c2c055a531"} Apr 17 17:47:24.586340 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:24.586302 2565 generic.go:358] "Generic (PLEG): container finished" podID="b1d27800-1b3a-42a3-a1ec-0756fb290961" 
containerID="5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a" exitCode=0 Apr 17 17:47:24.586717 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:24.586379 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b1d27800-1b3a-42a3-a1ec-0756fb290961","Type":"ContainerDied","Data":"5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a"} Apr 17 17:47:30.051139 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:30.050032 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89"] Apr 17 17:47:30.051139 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:30.050460 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="main" containerID="cri-o://5ba38a0af32e803336f6507bfba0845f0baac3f878cc825082893ecbbf60876b" gracePeriod=30 Apr 17 17:47:30.051139 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:30.050848 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="tokenizer" containerID="cri-o://7a4bd3c2c9880695a7b864acd3432f106b744b6201917b6b8d02795f008df4e5" gracePeriod=30 Apr 17 17:47:30.621308 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:30.621269 2565 generic.go:358] "Generic (PLEG): container finished" podID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerID="5ba38a0af32e803336f6507bfba0845f0baac3f878cc825082893ecbbf60876b" exitCode=0 Apr 17 17:47:30.621501 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:30.621371 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" event={"ID":"a546b81e-640c-499b-9dd1-d0c7af89fbb9","Type":"ContainerDied","Data":"5ba38a0af32e803336f6507bfba0845f0baac3f878cc825082893ecbbf60876b"} Apr 17 17:47:30.976070 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:47:30.976029 2565 logging.go:55] [core] [Channel #353 SubChannel #354]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.42:9003", ServerName: "10.134.0.42:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.42:9003: connect: connection refused" Apr 17 17:47:31.976270 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:31.976072 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.42:9003\" within 1s: context deadline exceeded" Apr 17 17:47:34.655666 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.655020 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc"] Apr 17 17:47:34.722971 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.722934 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc"] Apr 17 17:47:34.723155 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.723128 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.725802 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.725779 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-pjh9l\"" Apr 17 17:47:34.725802 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.725797 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 17 17:47:34.773330 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.773290 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.773512 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.773433 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acaa07a3-e325-40e2-b713-8c4aa881db64-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.773512 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.773475 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: 
\"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.773628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.773519 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmghc\" (UniqueName: \"kubernetes.io/projected/acaa07a3-e325-40e2-b713-8c4aa881db64-kube-api-access-fmghc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.773628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.773552 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.773628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.773590 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.874411 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.874377 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acaa07a3-e325-40e2-b713-8c4aa881db64-tls-certs\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.874411 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.874412 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.874830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.874449 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmghc\" (UniqueName: \"kubernetes.io/projected/acaa07a3-e325-40e2-b713-8c4aa881db64-kube-api-access-fmghc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.874830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.874475 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.874830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.874518 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-cache\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.874830 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.874569 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.875031 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.874862 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.875031 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.874964 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.875101 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.875072 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: 
\"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.875133 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.875118 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.877544 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.877515 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acaa07a3-e325-40e2-b713-8c4aa881db64-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:34.885134 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:34.885084 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmghc\" (UniqueName: \"kubernetes.io/projected/acaa07a3-e325-40e2-b713-8c4aa881db64-kube-api-access-fmghc\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:35.036540 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:35.036461 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:35.648163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:35.648124 2565 generic.go:358] "Generic (PLEG): container finished" podID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerID="7a4bd3c2c9880695a7b864acd3432f106b744b6201917b6b8d02795f008df4e5" exitCode=0 Apr 17 17:47:35.648343 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:35.648296 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" event={"ID":"a546b81e-640c-499b-9dd1-d0c7af89fbb9","Type":"ContainerDied","Data":"7a4bd3c2c9880695a7b864acd3432f106b744b6201917b6b8d02795f008df4e5"} Apr 17 17:47:36.849492 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.849459 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:47:36.894429 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.894387 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-cache\") pod \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " Apr 17 17:47:36.894586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.894441 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-tmp\") pod \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " Apr 17 17:47:36.894586 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.894566 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kserve-provision-location\") pod \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " Apr 17 17:47:36.894726 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.894617 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tls-certs\") pod \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " Apr 17 17:47:36.894726 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.894642 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-uds\") pod \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " Apr 17 17:47:36.894726 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.894682 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6qhlv\" (UniqueName: \"kubernetes.io/projected/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kube-api-access-6qhlv\") pod \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\" (UID: \"a546b81e-640c-499b-9dd1-d0c7af89fbb9\") " Apr 17 17:47:36.894929 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.894674 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "a546b81e-640c-499b-9dd1-d0c7af89fbb9" (UID: "a546b81e-640c-499b-9dd1-d0c7af89fbb9"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:36.895353 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.895327 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a546b81e-640c-499b-9dd1-d0c7af89fbb9" (UID: "a546b81e-640c-499b-9dd1-d0c7af89fbb9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:36.895943 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.895735 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "a546b81e-640c-499b-9dd1-d0c7af89fbb9" (UID: "a546b81e-640c-499b-9dd1-d0c7af89fbb9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:36.895943 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.895850 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "a546b81e-640c-499b-9dd1-d0c7af89fbb9" (UID: "a546b81e-640c-499b-9dd1-d0c7af89fbb9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:47:36.897735 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.897394 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a546b81e-640c-499b-9dd1-d0c7af89fbb9" (UID: "a546b81e-640c-499b-9dd1-d0c7af89fbb9"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:47:36.898205 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.898180 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kube-api-access-6qhlv" (OuterVolumeSpecName: "kube-api-access-6qhlv") pod "a546b81e-640c-499b-9dd1-d0c7af89fbb9" (UID: "a546b81e-640c-499b-9dd1-d0c7af89fbb9"). InnerVolumeSpecName "kube-api-access-6qhlv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:47:36.996436 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.996349 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:47:36.996436 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.996386 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:47:36.996436 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.996406 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:47:36.996436 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.996420 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6qhlv\" (UniqueName: \"kubernetes.io/projected/a546b81e-640c-499b-9dd1-d0c7af89fbb9-kube-api-access-6qhlv\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:47:36.996436 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.996432 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:47:36.996436 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:36.996445 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/a546b81e-640c-499b-9dd1-d0c7af89fbb9-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:47:37.211634 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.211536 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc"] Apr 17 17:47:37.212721 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:47:37.212670 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacaa07a3_e325_40e2_b713_8c4aa881db64.slice/crio-5ef198c90a6753b516a2a60fadfdb7acf7ed59e98976291a5388465f151041e0 WatchSource:0}: Error finding container 5ef198c90a6753b516a2a60fadfdb7acf7ed59e98976291a5388465f151041e0: Status 404 returned error can't find the container with id 5ef198c90a6753b516a2a60fadfdb7acf7ed59e98976291a5388465f151041e0 Apr 17 17:47:37.659440 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.659409 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" Apr 17 17:47:37.659616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.659402 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89" event={"ID":"a546b81e-640c-499b-9dd1-d0c7af89fbb9","Type":"ContainerDied","Data":"ded95ff964c2da251cf9c3745d232e8b8d7ebd007dc5716958e22b67bb0b18f3"} Apr 17 17:47:37.659616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.659582 2565 scope.go:117] "RemoveContainer" containerID="7a4bd3c2c9880695a7b864acd3432f106b744b6201917b6b8d02795f008df4e5" Apr 17 17:47:37.661419 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.661378 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" event={"ID":"acaa07a3-e325-40e2-b713-8c4aa881db64","Type":"ContainerStarted","Data":"67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857"} Apr 17 17:47:37.661537 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.661421 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" event={"ID":"acaa07a3-e325-40e2-b713-8c4aa881db64","Type":"ContainerStarted","Data":"5ef198c90a6753b516a2a60fadfdb7acf7ed59e98976291a5388465f151041e0"} Apr 17 17:47:37.670331 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.670310 2565 scope.go:117] "RemoveContainer" containerID="5ba38a0af32e803336f6507bfba0845f0baac3f878cc825082893ecbbf60876b" Apr 17 17:47:37.679142 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.679106 2565 scope.go:117] "RemoveContainer" containerID="7cd8aafa554b1e24cbbe312149878d7a3ee5d985f8e10b3f691c4af90d520154" Apr 17 17:47:37.699748 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.699720 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89"] Apr 17 17:47:37.703418 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.703370 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schepnl89"] Apr 17 17:47:37.954270 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:37.954162 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" path="/var/lib/kubelet/pods/a546b81e-640c-499b-9dd1-d0c7af89fbb9/volumes" Apr 17 17:47:38.667514 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:38.667478 2565 generic.go:358] "Generic (PLEG): container finished" podID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerID="67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857" exitCode=0 Apr 17 17:47:38.667685 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:38.667522 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" event={"ID":"acaa07a3-e325-40e2-b713-8c4aa881db64","Type":"ContainerDied","Data":"67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857"} Apr 17 17:47:39.672872 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:39.672830 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" event={"ID":"acaa07a3-e325-40e2-b713-8c4aa881db64","Type":"ContainerStarted","Data":"0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad"} Apr 17 17:47:51.722893 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:51.722855 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" event={"ID":"acaa07a3-e325-40e2-b713-8c4aa881db64","Type":"ContainerStarted","Data":"edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b"} Apr 17 
17:47:51.723306 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:51.723038 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:51.743151 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:51.743099 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" podStartSLOduration=17.743081335 podStartE2EDuration="17.743081335s" podCreationTimestamp="2026-04-17 17:47:34 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:47:51.741334517 +0000 UTC m=+1412.399378826" watchObservedRunningTime="2026-04-17 17:47:51.743081335 +0000 UTC m=+1412.401125644" Apr 17 17:47:52.727976 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:52.727933 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b1d27800-1b3a-42a3-a1ec-0756fb290961","Type":"ContainerStarted","Data":"13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae"} Apr 17 17:47:52.757451 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:52.757386 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.735831084 podStartE2EDuration="33.757370243s" podCreationTimestamp="2026-04-17 17:47:19 +0000 UTC" firstStartedPulling="2026-04-17 17:47:24.587616884 +0000 UTC m=+1385.245661170" lastFinishedPulling="2026-04-17 17:47:51.60915604 +0000 UTC m=+1412.267200329" observedRunningTime="2026-04-17 17:47:52.755615427 +0000 UTC m=+1413.413659736" watchObservedRunningTime="2026-04-17 17:47:52.757370243 +0000 UTC m=+1413.415414550" Apr 17 17:47:55.036784 ip-10-0-133-87 kubenswrapper[2565]: I0417 
17:47:55.036743 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:47:55.037297 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:47:55.036841 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:48:05.038926 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:48:05.038893 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:48:05.040158 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:48:05.040129 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:48:15.775336 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:48:15.775306 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:50:07.318117 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:07.318079 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 17:50:07.320919 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:07.318426 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="b1d27800-1b3a-42a3-a1ec-0756fb290961" containerName="main" containerID="cri-o://13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae" gracePeriod=30 Apr 17 17:50:08.090653 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.090630 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:50:08.157819 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.157780 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-kserve-provision-location\") pod \"b1d27800-1b3a-42a3-a1ec-0756fb290961\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " Apr 17 17:50:08.158007 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.157918 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1d27800-1b3a-42a3-a1ec-0756fb290961-tls-certs\") pod \"b1d27800-1b3a-42a3-a1ec-0756fb290961\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " Apr 17 17:50:08.158007 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.157949 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6s8wk\" (UniqueName: \"kubernetes.io/projected/b1d27800-1b3a-42a3-a1ec-0756fb290961-kube-api-access-6s8wk\") pod \"b1d27800-1b3a-42a3-a1ec-0756fb290961\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " Apr 17 17:50:08.158007 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.157976 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-dshm\") pod \"b1d27800-1b3a-42a3-a1ec-0756fb290961\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " Apr 17 17:50:08.158161 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.158009 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-home\") pod \"b1d27800-1b3a-42a3-a1ec-0756fb290961\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " Apr 17 17:50:08.158161 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:50:08.158060 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-model-cache\") pod \"b1d27800-1b3a-42a3-a1ec-0756fb290961\" (UID: \"b1d27800-1b3a-42a3-a1ec-0756fb290961\") " Apr 17 17:50:08.158430 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.158398 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-model-cache" (OuterVolumeSpecName: "model-cache") pod "b1d27800-1b3a-42a3-a1ec-0756fb290961" (UID: "b1d27800-1b3a-42a3-a1ec-0756fb290961"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:08.158556 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.158470 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-home" (OuterVolumeSpecName: "home") pod "b1d27800-1b3a-42a3-a1ec-0756fb290961" (UID: "b1d27800-1b3a-42a3-a1ec-0756fb290961"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:08.160215 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.160182 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1d27800-1b3a-42a3-a1ec-0756fb290961-kube-api-access-6s8wk" (OuterVolumeSpecName: "kube-api-access-6s8wk") pod "b1d27800-1b3a-42a3-a1ec-0756fb290961" (UID: "b1d27800-1b3a-42a3-a1ec-0756fb290961"). InnerVolumeSpecName "kube-api-access-6s8wk". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:50:08.160591 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.160574 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1d27800-1b3a-42a3-a1ec-0756fb290961-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b1d27800-1b3a-42a3-a1ec-0756fb290961" (UID: "b1d27800-1b3a-42a3-a1ec-0756fb290961"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:50:08.160689 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.160654 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-dshm" (OuterVolumeSpecName: "dshm") pod "b1d27800-1b3a-42a3-a1ec-0756fb290961" (UID: "b1d27800-1b3a-42a3-a1ec-0756fb290961"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:08.200442 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.200403 2565 generic.go:358] "Generic (PLEG): container finished" podID="b1d27800-1b3a-42a3-a1ec-0756fb290961" containerID="13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae" exitCode=0 Apr 17 17:50:08.200601 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.200452 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b1d27800-1b3a-42a3-a1ec-0756fb290961","Type":"ContainerDied","Data":"13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae"} Apr 17 17:50:08.200601 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.200477 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 17 17:50:08.200601 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.200487 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b1d27800-1b3a-42a3-a1ec-0756fb290961","Type":"ContainerDied","Data":"d9d6f5135640982bddbd09b139d49d215e0e29bac771fc57445332c2c055a531"} Apr 17 17:50:08.200601 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.200507 2565 scope.go:117] "RemoveContainer" containerID="13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae" Apr 17 17:50:08.213051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.213023 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b1d27800-1b3a-42a3-a1ec-0756fb290961" (UID: "b1d27800-1b3a-42a3-a1ec-0756fb290961"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:08.219392 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.219373 2565 scope.go:117] "RemoveContainer" containerID="5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a" Apr 17 17:50:08.259811 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.259777 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6s8wk\" (UniqueName: \"kubernetes.io/projected/b1d27800-1b3a-42a3-a1ec-0756fb290961-kube-api-access-6s8wk\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:08.259811 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.259809 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-dshm\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:08.259999 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.259822 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-home\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:08.259999 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.259834 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-model-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:08.259999 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.259849 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1d27800-1b3a-42a3-a1ec-0756fb290961-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:08.259999 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.259861 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/b1d27800-1b3a-42a3-a1ec-0756fb290961-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:08.278271 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.278232 2565 scope.go:117] "RemoveContainer" containerID="13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae" Apr 17 17:50:08.278563 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:50:08.278543 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae\": container with ID starting with 13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae not found: ID does not exist" containerID="13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae" Apr 17 17:50:08.278632 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.278572 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae"} err="failed to get container status \"13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae\": rpc error: code = NotFound desc = could not find container \"13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae\": container with ID starting with 13dcec338ac6f41298fb88407dbe1cae86bed6127a4ba65e9a0e90f513ef1dae not found: ID does not exist" Apr 17 17:50:08.278632 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.278589 2565 scope.go:117] "RemoveContainer" containerID="5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a" Apr 17 17:50:08.278832 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:50:08.278814 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a\": container with ID starting with 5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a not 
found: ID does not exist" containerID="5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a" Apr 17 17:50:08.278887 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.278844 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a"} err="failed to get container status \"5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a\": rpc error: code = NotFound desc = could not find container \"5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a\": container with ID starting with 5b79d0a2d23accf0f33c1e504b0676614dd3ba191625375dcb239f08f231b86a not found: ID does not exist" Apr 17 17:50:08.526956 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.526926 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 17:50:08.534924 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:08.534896 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 17 17:50:09.951576 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:09.951543 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1d27800-1b3a-42a3-a1ec-0756fb290961" path="/var/lib/kubelet/pods/b1d27800-1b3a-42a3-a1ec-0756fb290961/volumes" Apr 17 17:50:39.447356 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447320 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4"] Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447733 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="main" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447747 2565 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="main" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447761 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1d27800-1b3a-42a3-a1ec-0756fb290961" containerName="main" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447767 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d27800-1b3a-42a3-a1ec-0756fb290961" containerName="main" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447774 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="storage-initializer" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447779 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="storage-initializer" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447786 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="tokenizer" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447791 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="tokenizer" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447802 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1d27800-1b3a-42a3-a1ec-0756fb290961" containerName="storage-initializer" Apr 17 17:50:39.447843 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447808 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1d27800-1b3a-42a3-a1ec-0756fb290961" containerName="storage-initializer" Apr 17 17:50:39.448212 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447864 2565 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="main" Apr 17 17:50:39.448212 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447873 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1d27800-1b3a-42a3-a1ec-0756fb290961" containerName="main" Apr 17 17:50:39.448212 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.447881 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="a546b81e-640c-499b-9dd1-d0c7af89fbb9" containerName="tokenizer" Apr 17 17:50:39.451999 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.451981 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.454538 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.454516 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 17 17:50:39.460624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.460597 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4"] Apr 17 17:50:39.528978 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.528945 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-dshm\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.529163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.529008 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-home\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: 
\"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.529163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.529033 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-tls-certs\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.529163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.529067 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-model-cache\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.529163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.529092 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bbrdm\" (UniqueName: \"kubernetes.io/projected/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kube-api-access-bbrdm\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.529349 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.529176 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.630676 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.630639 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.630850 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.630720 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-dshm\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.630850 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.630818 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-home\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.630850 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.630837 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-tls-certs\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.630964 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.630858 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-model-cache\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.631045 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.631024 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bbrdm\" (UniqueName: \"kubernetes.io/projected/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kube-api-access-bbrdm\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.631108 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.631056 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.631194 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.631172 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-home\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.631265 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.631212 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-model-cache\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.632929 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.632907 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-dshm\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.633194 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.633176 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-tls-certs\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.640022 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.640005 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bbrdm\" (UniqueName: \"kubernetes.io/projected/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kube-api-access-bbrdm\") pod \"scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.763402 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.763315 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:39.889059 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:39.889031 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4"] Apr 17 17:50:39.891561 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:50:39.891530 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a44e95_7aae_42bc_8fa3_b779fa49c0bc.slice/crio-65675957d43fff575947f7c673b22f002c6eab444665c5cb515ab1d23bc27c0d WatchSource:0}: Error finding container 65675957d43fff575947f7c673b22f002c6eab444665c5cb515ab1d23bc27c0d: Status 404 returned error can't find the container with id 65675957d43fff575947f7c673b22f002c6eab444665c5cb515ab1d23bc27c0d Apr 17 17:50:40.305869 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:40.305825 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" event={"ID":"54a44e95-7aae-42bc-8fa3-b779fa49c0bc","Type":"ContainerStarted","Data":"12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713"} Apr 17 17:50:40.305869 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:40.305872 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" event={"ID":"54a44e95-7aae-42bc-8fa3-b779fa49c0bc","Type":"ContainerStarted","Data":"65675957d43fff575947f7c673b22f002c6eab444665c5cb515ab1d23bc27c0d"} Apr 17 17:50:44.110278 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:50:44.108128 2565 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a44e95_7aae_42bc_8fa3_b779fa49c0bc.slice/crio-12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713.scope\": RecentStats: 
unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod54a44e95_7aae_42bc_8fa3_b779fa49c0bc.slice/crio-conmon-12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713.scope\": RecentStats: unable to find data in memory cache]" Apr 17 17:50:44.325420 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:44.325378 2565 generic.go:358] "Generic (PLEG): container finished" podID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" containerID="12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713" exitCode=0 Apr 17 17:50:44.325588 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:44.325455 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" event={"ID":"54a44e95-7aae-42bc-8fa3-b779fa49c0bc","Type":"ContainerDied","Data":"12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713"} Apr 17 17:50:45.331083 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:45.331047 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" event={"ID":"54a44e95-7aae-42bc-8fa3-b779fa49c0bc","Type":"ContainerStarted","Data":"f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b"} Apr 17 17:50:45.351012 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:45.350952 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" podStartSLOduration=6.350933201 podStartE2EDuration="6.350933201s" podCreationTimestamp="2026-04-17 17:50:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:50:45.349421553 +0000 UTC m=+1586.007465887" watchObservedRunningTime="2026-04-17 17:50:45.350933201 +0000 UTC m=+1586.008977509" Apr 17 17:50:49.763867 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:49.763830 2565 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:49.763867 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:49.763874 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:49.776478 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:49.776454 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:50.358368 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:50.358341 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:50:53.701847 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:53.701803 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc"] Apr 17 17:50:53.702398 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:53.702225 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="main" containerID="cri-o://0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad" gracePeriod=30 Apr 17 17:50:53.702667 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:53.702641 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="tokenizer" containerID="cri-o://edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b" gracePeriod=30 Apr 17 17:50:54.361907 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:54.361866 2565 
generic.go:358] "Generic (PLEG): container finished" podID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerID="0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad" exitCode=0 Apr 17 17:50:54.362083 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:54.361935 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" event={"ID":"acaa07a3-e325-40e2-b713-8c4aa881db64","Type":"ContainerDied","Data":"0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad"} Apr 17 17:50:55.077351 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.077037 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:50:55.180960 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.180926 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-tmp\") pod \"acaa07a3-e325-40e2-b713-8c4aa881db64\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " Apr 17 17:50:55.181157 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.180971 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-cache\") pod \"acaa07a3-e325-40e2-b713-8c4aa881db64\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " Apr 17 17:50:55.181157 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181001 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-kserve-provision-location\") pod \"acaa07a3-e325-40e2-b713-8c4aa881db64\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " Apr 17 17:50:55.181157 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:50:55.181053 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acaa07a3-e325-40e2-b713-8c4aa881db64-tls-certs\") pod \"acaa07a3-e325-40e2-b713-8c4aa881db64\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " Apr 17 17:50:55.181157 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181093 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-uds\") pod \"acaa07a3-e325-40e2-b713-8c4aa881db64\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " Apr 17 17:50:55.181157 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181121 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-fmghc\" (UniqueName: \"kubernetes.io/projected/acaa07a3-e325-40e2-b713-8c4aa881db64-kube-api-access-fmghc\") pod \"acaa07a3-e325-40e2-b713-8c4aa881db64\" (UID: \"acaa07a3-e325-40e2-b713-8c4aa881db64\") " Apr 17 17:50:55.181397 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181213 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "acaa07a3-e325-40e2-b713-8c4aa881db64" (UID: "acaa07a3-e325-40e2-b713-8c4aa881db64"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:55.181397 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181271 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "acaa07a3-e325-40e2-b713-8c4aa881db64" (UID: "acaa07a3-e325-40e2-b713-8c4aa881db64"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:55.181486 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181430 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "acaa07a3-e325-40e2-b713-8c4aa881db64" (UID: "acaa07a3-e325-40e2-b713-8c4aa881db64"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:55.181486 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181442 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:55.181486 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181465 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:55.181796 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.181748 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "acaa07a3-e325-40e2-b713-8c4aa881db64" (UID: "acaa07a3-e325-40e2-b713-8c4aa881db64"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:50:55.183329 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.183284 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acaa07a3-e325-40e2-b713-8c4aa881db64-kube-api-access-fmghc" (OuterVolumeSpecName: "kube-api-access-fmghc") pod "acaa07a3-e325-40e2-b713-8c4aa881db64" (UID: "acaa07a3-e325-40e2-b713-8c4aa881db64"). InnerVolumeSpecName "kube-api-access-fmghc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:50:55.183517 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.183407 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acaa07a3-e325-40e2-b713-8c4aa881db64-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "acaa07a3-e325-40e2-b713-8c4aa881db64" (UID: "acaa07a3-e325-40e2-b713-8c4aa881db64"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:50:55.282265 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.282205 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:55.282265 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.282266 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acaa07a3-e325-40e2-b713-8c4aa881db64-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:55.282481 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.282285 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/acaa07a3-e325-40e2-b713-8c4aa881db64-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:55.282481 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.282299 2565 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-fmghc\" (UniqueName: \"kubernetes.io/projected/acaa07a3-e325-40e2-b713-8c4aa881db64-kube-api-access-fmghc\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:50:55.368462 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.368419 2565 generic.go:358] "Generic (PLEG): container finished" podID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerID="edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b" exitCode=0 Apr 17 17:50:55.368700 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.368555 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" Apr 17 17:50:55.368700 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.368559 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" event={"ID":"acaa07a3-e325-40e2-b713-8c4aa881db64","Type":"ContainerDied","Data":"edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b"} Apr 17 17:50:55.368700 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.368607 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" event={"ID":"acaa07a3-e325-40e2-b713-8c4aa881db64","Type":"ContainerDied","Data":"5ef198c90a6753b516a2a60fadfdb7acf7ed59e98976291a5388465f151041e0"} Apr 17 17:50:55.368700 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.368631 2565 scope.go:117] "RemoveContainer" containerID="edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b" Apr 17 17:50:55.381968 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.381571 2565 scope.go:117] "RemoveContainer" containerID="0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad" Apr 17 17:50:55.389745 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.389727 2565 
scope.go:117] "RemoveContainer" containerID="67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857" Apr 17 17:50:55.395851 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.395829 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc"] Apr 17 17:50:55.398919 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.398898 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc"] Apr 17 17:50:55.399719 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.399693 2565 scope.go:117] "RemoveContainer" containerID="edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b" Apr 17 17:50:55.399960 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:50:55.399943 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b\": container with ID starting with edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b not found: ID does not exist" containerID="edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b" Apr 17 17:50:55.400022 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.399970 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b"} err="failed to get container status \"edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b\": rpc error: code = NotFound desc = could not find container \"edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b\": container with ID starting with edbd9f87617e3fec2d241e03582e78c0f432413eaf2c88612b3839e3d37b191b not found: ID does not exist" Apr 17 17:50:55.400022 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.399989 2565 scope.go:117] "RemoveContainer" 
containerID="0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad" Apr 17 17:50:55.400283 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:50:55.400266 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad\": container with ID starting with 0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad not found: ID does not exist" containerID="0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad" Apr 17 17:50:55.400353 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.400289 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad"} err="failed to get container status \"0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad\": rpc error: code = NotFound desc = could not find container \"0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad\": container with ID starting with 0fa260f2df75b7c496abb11c6df61a22d8c3620568220fc9e2abfa1dfd2248ad not found: ID does not exist" Apr 17 17:50:55.400353 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.400305 2565 scope.go:117] "RemoveContainer" containerID="67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857" Apr 17 17:50:55.400552 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:50:55.400534 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857\": container with ID starting with 67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857 not found: ID does not exist" containerID="67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857" Apr 17 17:50:55.400598 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.400560 2565 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857"} err="failed to get container status \"67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857\": rpc error: code = NotFound desc = could not find container \"67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857\": container with ID starting with 67c56c7bf5235a4a637bff3d1744530622ae1d29cfad35f3d2fc4122dd69e857 not found: ID does not exist" Apr 17 17:50:55.952103 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:50:55.952070 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" path="/var/lib/kubelet/pods/acaa07a3-e325-40e2-b713-8c4aa881db64/volumes" Apr 17 17:51:00.040418 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:00.040370 2565 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-796b4plmdc" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.44:8082/healthz\": context deadline exceeded (Client.Timeout exceeded while awaiting headers)" Apr 17 17:51:22.828286 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828232 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv"] Apr 17 17:51:22.828722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828602 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="main" Apr 17 17:51:22.828722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828614 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="main" Apr 17 17:51:22.828722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828625 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="storage-initializer" Apr 17 17:51:22.828722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828633 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="storage-initializer" Apr 17 17:51:22.828722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828641 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="tokenizer" Apr 17 17:51:22.828722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828648 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="tokenizer" Apr 17 17:51:22.828722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828703 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="tokenizer" Apr 17 17:51:22.828722 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.828712 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="acaa07a3-e325-40e2-b713-8c4aa881db64" containerName="main" Apr 17 17:51:22.832661 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.832632 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:22.835136 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.835109 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 17 17:51:22.835299 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.835139 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-qqt4n\"" Apr 17 17:51:22.845399 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.845353 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv"] Apr 17 17:51:22.919527 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.919490 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vk6m5\" (UniqueName: \"kubernetes.io/projected/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kube-api-access-vk6m5\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:22.919527 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.919528 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e40adf7-9206-4b48-8e7b-2f5ef4822196-tls-certs\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:22.919745 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.919559 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-home\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:22.919745 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.919573 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-dshm\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:22.919745 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.919614 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:22.919864 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:22.919734 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-model-cache\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020304 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020266 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kserve-provision-location\") pod 
\"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020489 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020319 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-model-cache\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020489 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020380 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vk6m5\" (UniqueName: \"kubernetes.io/projected/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kube-api-access-vk6m5\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020516 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e40adf7-9206-4b48-8e7b-2f5ef4822196-tls-certs\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020567 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-home\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020628 
ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020593 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-dshm\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020628 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020620 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020812 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020700 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-model-cache\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.020913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.020893 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-home\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.022776 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.022753 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-dshm\") 
pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.023184 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.023163 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e40adf7-9206-4b48-8e7b-2f5ef4822196-tls-certs\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.029197 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.029173 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vk6m5\" (UniqueName: \"kubernetes.io/projected/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kube-api-access-vk6m5\") pod \"router-with-refs-pd-test-kserve-6c54b96944-9l5gv\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.144923 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.144883 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:23.375708 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.375670 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2"] Apr 17 17:51:23.380585 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.380562 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.382640 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.382607 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-9cbww\"" Apr 17 17:51:23.391616 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.391591 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2"] Apr 17 17:51:23.423590 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.423508 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.423590 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.423551 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.423769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.423606 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: 
\"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.423769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.423633 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.423769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.423670 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2df51df7-842a-4ce9-80d8-db314e9656ea-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.423769 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.423691 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-465jc\" (UniqueName: \"kubernetes.io/projected/2df51df7-842a-4ce9-80d8-db314e9656ea-kube-api-access-465jc\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.485638 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.485611 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv"] Apr 17 17:51:23.487510 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:51:23.487476 2565 manager.go:1169] Failed to process watch event 
{EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2e40adf7_9206_4b48_8e7b_2f5ef4822196.slice/crio-45d1b638513a243c6a364d664fe6bb32ead039cd1576962e046de45430d8d138 WatchSource:0}: Error finding container 45d1b638513a243c6a364d664fe6bb32ead039cd1576962e046de45430d8d138: Status 404 returned error can't find the container with id 45d1b638513a243c6a364d664fe6bb32ead039cd1576962e046de45430d8d138 Apr 17 17:51:23.524751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.524714 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.524942 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.524771 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.524942 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.524812 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.524942 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.524839 2565 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.524942 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.524924 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2df51df7-842a-4ce9-80d8-db314e9656ea-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.525177 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.524975 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-465jc\" (UniqueName: \"kubernetes.io/projected/2df51df7-842a-4ce9-80d8-db314e9656ea-kube-api-access-465jc\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.525231 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.525178 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.525304 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.525274 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.525304 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.525291 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.525376 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.525324 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.527378 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.527358 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2df51df7-842a-4ce9-80d8-db314e9656ea-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.533080 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.533058 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-465jc\" (UniqueName: 
\"kubernetes.io/projected/2df51df7-842a-4ce9-80d8-db314e9656ea-kube-api-access-465jc\") pod \"router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.693421 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.693326 2565 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:23.835313 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:23.835287 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2"] Apr 17 17:51:23.837661 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:51:23.837632 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2df51df7_842a_4ce9_80d8_db314e9656ea.slice/crio-8906d8a0f932cec93003ff97041b5ad4b7043af42d0730c3514603f49546faae WatchSource:0}: Error finding container 8906d8a0f932cec93003ff97041b5ad4b7043af42d0730c3514603f49546faae: Status 404 returned error can't find the container with id 8906d8a0f932cec93003ff97041b5ad4b7043af42d0730c3514603f49546faae Apr 17 17:51:24.468912 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:24.468873 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" event={"ID":"2df51df7-842a-4ce9-80d8-db314e9656ea","Type":"ContainerStarted","Data":"cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56"} Apr 17 17:51:24.469115 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:24.468917 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" 
event={"ID":"2df51df7-842a-4ce9-80d8-db314e9656ea","Type":"ContainerStarted","Data":"8906d8a0f932cec93003ff97041b5ad4b7043af42d0730c3514603f49546faae"} Apr 17 17:51:24.470380 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:24.470347 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" event={"ID":"2e40adf7-9206-4b48-8e7b-2f5ef4822196","Type":"ContainerStarted","Data":"45d1b638513a243c6a364d664fe6bb32ead039cd1576962e046de45430d8d138"} Apr 17 17:51:25.475161 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:25.475120 2565 generic.go:358] "Generic (PLEG): container finished" podID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerID="cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56" exitCode=0 Apr 17 17:51:25.475668 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:25.475208 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" event={"ID":"2df51df7-842a-4ce9-80d8-db314e9656ea","Type":"ContainerDied","Data":"cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56"} Apr 17 17:51:25.476622 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:25.476597 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" event={"ID":"2e40adf7-9206-4b48-8e7b-2f5ef4822196","Type":"ContainerStarted","Data":"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df"} Apr 17 17:51:25.476731 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:25.476709 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:26.483372 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:26.483328 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" 
event={"ID":"2df51df7-842a-4ce9-80d8-db314e9656ea","Type":"ContainerStarted","Data":"b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679"} Apr 17 17:51:26.483840 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:26.483389 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" event={"ID":"2df51df7-842a-4ce9-80d8-db314e9656ea","Type":"ContainerStarted","Data":"355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a"} Apr 17 17:51:26.483840 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:26.483507 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:26.485284 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:26.485230 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" event={"ID":"2e40adf7-9206-4b48-8e7b-2f5ef4822196","Type":"ContainerStarted","Data":"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263"} Apr 17 17:51:26.507228 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:26.507169 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" podStartSLOduration=3.507150686 podStartE2EDuration="3.507150686s" podCreationTimestamp="2026-04-17 17:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:51:26.504529926 +0000 UTC m=+1627.162574233" watchObservedRunningTime="2026-04-17 17:51:26.507150686 +0000 UTC m=+1627.165194995" Apr 17 17:51:30.501474 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:30.501438 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" 
containerID="4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263" exitCode=0 Apr 17 17:51:30.501897 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:30.501514 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" event={"ID":"2e40adf7-9206-4b48-8e7b-2f5ef4822196","Type":"ContainerDied","Data":"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263"} Apr 17 17:51:31.511643 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:31.511609 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" event={"ID":"2e40adf7-9206-4b48-8e7b-2f5ef4822196","Type":"ContainerStarted","Data":"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f"} Apr 17 17:51:31.537557 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:31.537488 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podStartSLOduration=8.646518659 podStartE2EDuration="9.537466805s" podCreationTimestamp="2026-04-17 17:51:22 +0000 UTC" firstStartedPulling="2026-04-17 17:51:23.489634499 +0000 UTC m=+1624.147678786" lastFinishedPulling="2026-04-17 17:51:24.380582642 +0000 UTC m=+1625.038626932" observedRunningTime="2026-04-17 17:51:31.534963108 +0000 UTC m=+1632.193007443" watchObservedRunningTime="2026-04-17 17:51:31.537466805 +0000 UTC m=+1632.195511114" Apr 17 17:51:32.402610 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.402571 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4"] Apr 17 17:51:32.402974 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.402938 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" podUID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" containerName="main" 
containerID="cri-o://f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b" gracePeriod=30 Apr 17 17:51:32.666568 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.666500 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:51:32.714778 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.714739 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bbrdm\" (UniqueName: \"kubernetes.io/projected/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kube-api-access-bbrdm\") pod \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " Apr 17 17:51:32.714949 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.714797 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-home\") pod \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " Apr 17 17:51:32.714949 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.714831 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-tls-certs\") pod \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " Apr 17 17:51:32.714949 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.714869 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-dshm\") pod \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " Apr 17 17:51:32.714949 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.714893 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-model-cache\") pod \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " Apr 17 17:51:32.714949 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.714942 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kserve-provision-location\") pod \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\" (UID: \"54a44e95-7aae-42bc-8fa3-b779fa49c0bc\") " Apr 17 17:51:32.715283 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.715109 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-home" (OuterVolumeSpecName: "home") pod "54a44e95-7aae-42bc-8fa3-b779fa49c0bc" (UID: "54a44e95-7aae-42bc-8fa3-b779fa49c0bc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:32.715350 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.715284 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-home\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:51:32.716063 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.716034 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-model-cache" (OuterVolumeSpecName: "model-cache") pod "54a44e95-7aae-42bc-8fa3-b779fa49c0bc" (UID: "54a44e95-7aae-42bc-8fa3-b779fa49c0bc"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:32.717591 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.717552 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kube-api-access-bbrdm" (OuterVolumeSpecName: "kube-api-access-bbrdm") pod "54a44e95-7aae-42bc-8fa3-b779fa49c0bc" (UID: "54a44e95-7aae-42bc-8fa3-b779fa49c0bc"). InnerVolumeSpecName "kube-api-access-bbrdm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:51:32.717591 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.717567 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-dshm" (OuterVolumeSpecName: "dshm") pod "54a44e95-7aae-42bc-8fa3-b779fa49c0bc" (UID: "54a44e95-7aae-42bc-8fa3-b779fa49c0bc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:32.717856 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.717830 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "54a44e95-7aae-42bc-8fa3-b779fa49c0bc" (UID: "54a44e95-7aae-42bc-8fa3-b779fa49c0bc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 17 17:51:32.779295 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.779205 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "54a44e95-7aae-42bc-8fa3-b779fa49c0bc" (UID: "54a44e95-7aae-42bc-8fa3-b779fa49c0bc"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:51:32.816035 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.815994 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-dshm\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:51:32.816035 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.816031 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-model-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:51:32.816255 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.816046 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:51:32.816255 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.816063 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bbrdm\" (UniqueName: \"kubernetes.io/projected/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-kube-api-access-bbrdm\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:51:32.816255 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:32.816075 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/54a44e95-7aae-42bc-8fa3-b779fa49c0bc-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:51:33.145437 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.145394 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:33.145437 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.145437 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:33.146848 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.146811 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:51:33.520991 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.520902 2565 generic.go:358] "Generic (PLEG): container finished" podID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" containerID="f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b" exitCode=0 Apr 17 17:51:33.520991 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.520975 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" Apr 17 17:51:33.521170 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.520988 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" event={"ID":"54a44e95-7aae-42bc-8fa3-b779fa49c0bc","Type":"ContainerDied","Data":"f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b"} Apr 17 17:51:33.521170 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.521026 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4" event={"ID":"54a44e95-7aae-42bc-8fa3-b779fa49c0bc","Type":"ContainerDied","Data":"65675957d43fff575947f7c673b22f002c6eab444665c5cb515ab1d23bc27c0d"} Apr 17 17:51:33.521170 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.521045 2565 scope.go:117] "RemoveContainer" containerID="f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b" Apr 17 17:51:33.530322 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.530299 
2565 scope.go:117] "RemoveContainer" containerID="12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713" Apr 17 17:51:33.545664 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.545636 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4"] Apr 17 17:51:33.546291 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.546220 2565 scope.go:117] "RemoveContainer" containerID="f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b" Apr 17 17:51:33.546600 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:51:33.546574 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b\": container with ID starting with f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b not found: ID does not exist" containerID="f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b" Apr 17 17:51:33.546695 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.546614 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b"} err="failed to get container status \"f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b\": rpc error: code = NotFound desc = could not find container \"f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b\": container with ID starting with f8b53b8a29918cd83408337744bec14b7aea1743e810f2d0b4bc0eb532484b7b not found: ID does not exist" Apr 17 17:51:33.546695 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.546641 2565 scope.go:117] "RemoveContainer" containerID="12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713" Apr 17 17:51:33.546986 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:51:33.546961 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc 
= could not find container \"12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713\": container with ID starting with 12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713 not found: ID does not exist" containerID="12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713" Apr 17 17:51:33.547079 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.546992 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713"} err="failed to get container status \"12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713\": rpc error: code = NotFound desc = could not find container \"12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713\": container with ID starting with 12443a2d5f396b7a9db210f1f046a2e57d99d4fccacf469fa2bc8987d9cb7713 not found: ID does not exist" Apr 17 17:51:33.548060 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.548037 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-6bf5fccf9c-pw9z4"] Apr 17 17:51:33.693466 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.693424 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:33.693929 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.693595 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:33.696189 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.696167 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:33.952092 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:33.952053 2565 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" path="/var/lib/kubelet/pods/54a44e95-7aae-42bc-8fa3-b779fa49c0bc/volumes" Apr 17 17:51:34.527649 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:34.527617 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:51:43.146310 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:43.146230 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:51:43.158927 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:43.158900 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:51:53.145877 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:53.145832 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:51:56.536019 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:51:56.535988 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:52:03.146050 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:52:03.145985 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:52:13.146051 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:52:13.145997 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:52:23.146361 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:52:23.146310 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:52:33.145550 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:52:33.145499 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:52:43.145355 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:52:43.145227 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:52:53.145323 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:52:53.145276 2565 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.46:8001/health\": dial tcp 10.134.0.46:8001: connect: connection refused" Apr 17 17:53:03.155808 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:53:03.155767 2565 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:53:03.172473 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:53:03.172446 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" Apr 17 17:57:18.778378 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:18.778340 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2"] Apr 17 17:57:18.781270 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:18.778760 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="main" containerID="cri-o://355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a" gracePeriod=30 Apr 17 17:57:18.781270 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:18.778813 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="tokenizer" containerID="cri-o://b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679" gracePeriod=30 Apr 17 17:57:18.788714 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:18.788683 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv"] Apr 17 17:57:18.789227 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:18.789179 2565 kuberuntime_container.go:864] "Killing container with a 
grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" containerID="cri-o://3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f" gracePeriod=30 Apr 17 17:57:19.728559 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:19.728522 2565 generic.go:358] "Generic (PLEG): container finished" podID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerID="355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a" exitCode=0 Apr 17 17:57:19.728778 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:19.728586 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" event={"ID":"2df51df7-842a-4ce9-80d8-db314e9656ea","Type":"ContainerDied","Data":"355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a"} Apr 17 17:57:20.036990 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.036960 2565 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" Apr 17 17:57:20.080545 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.080513 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-tmp\") pod \"2df51df7-842a-4ce9-80d8-db314e9656ea\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " Apr 17 17:57:20.080705 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.080562 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-cache\") pod \"2df51df7-842a-4ce9-80d8-db314e9656ea\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " Apr 17 17:57:20.080853 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.080825 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2df51df7-842a-4ce9-80d8-db314e9656ea" (UID: "2df51df7-842a-4ce9-80d8-db314e9656ea"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:20.080915 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.080862 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2df51df7-842a-4ce9-80d8-db314e9656ea" (UID: "2df51df7-842a-4ce9-80d8-db314e9656ea"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:20.181675 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.181632 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2df51df7-842a-4ce9-80d8-db314e9656ea-tls-certs\") pod \"2df51df7-842a-4ce9-80d8-db314e9656ea\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " Apr 17 17:57:20.181875 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.181738 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-465jc\" (UniqueName: \"kubernetes.io/projected/2df51df7-842a-4ce9-80d8-db314e9656ea-kube-api-access-465jc\") pod \"2df51df7-842a-4ce9-80d8-db314e9656ea\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " Apr 17 17:57:20.181875 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.181760 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-kserve-provision-location\") pod \"2df51df7-842a-4ce9-80d8-db314e9656ea\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " Apr 17 17:57:20.181875 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.181782 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-uds\") pod \"2df51df7-842a-4ce9-80d8-db314e9656ea\" (UID: \"2df51df7-842a-4ce9-80d8-db314e9656ea\") " Apr 17 17:57:20.182052 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.181972 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-tmp\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:57:20.182052 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.181993 2565 reconciler_common.go:299] "Volume 
detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\"" Apr 17 17:57:20.182147 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.182102 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2df51df7-842a-4ce9-80d8-db314e9656ea" (UID: "2df51df7-842a-4ce9-80d8-db314e9656ea"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:20.182574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.182550 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2df51df7-842a-4ce9-80d8-db314e9656ea" (UID: "2df51df7-842a-4ce9-80d8-db314e9656ea"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 17 17:57:20.183834 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.183808 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2df51df7-842a-4ce9-80d8-db314e9656ea-kube-api-access-465jc" (OuterVolumeSpecName: "kube-api-access-465jc") pod "2df51df7-842a-4ce9-80d8-db314e9656ea" (UID: "2df51df7-842a-4ce9-80d8-db314e9656ea"). InnerVolumeSpecName "kube-api-access-465jc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 17 17:57:20.183935 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.183883 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2df51df7-842a-4ce9-80d8-db314e9656ea-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2df51df7-842a-4ce9-80d8-db314e9656ea" (UID: "2df51df7-842a-4ce9-80d8-db314e9656ea"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:57:20.282508 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.282413 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-465jc\" (UniqueName: \"kubernetes.io/projected/2df51df7-842a-4ce9-80d8-db314e9656ea-kube-api-access-465jc\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:20.282508 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.282451 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:20.282508 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.282462 2565 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2df51df7-842a-4ce9-80d8-db314e9656ea-tokenizer-uds\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:20.282508 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.282471 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2df51df7-842a-4ce9-80d8-db314e9656ea-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:20.733398 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.733352 2565 generic.go:358] "Generic (PLEG): container finished" podID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerID="b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679" exitCode=0
Apr 17 17:57:20.733582 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.733420 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" event={"ID":"2df51df7-842a-4ce9-80d8-db314e9656ea","Type":"ContainerDied","Data":"b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679"}
Apr 17 17:57:20.733582 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.733448 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2" event={"ID":"2df51df7-842a-4ce9-80d8-db314e9656ea","Type":"ContainerDied","Data":"8906d8a0f932cec93003ff97041b5ad4b7043af42d0730c3514603f49546faae"}
Apr 17 17:57:20.733582 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.733457 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2"
Apr 17 17:57:20.733582 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.733468 2565 scope.go:117] "RemoveContainer" containerID="b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679"
Apr 17 17:57:20.742733 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.742715 2565 scope.go:117] "RemoveContainer" containerID="355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a"
Apr 17 17:57:20.749948 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.749930 2565 scope.go:117] "RemoveContainer" containerID="cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56"
Apr 17 17:57:20.757407 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.757386 2565 scope.go:117] "RemoveContainer" containerID="b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679"
Apr 17 17:57:20.757806 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:57:20.757775 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679\": container with ID starting with b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679 not found: ID does not exist" containerID="b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679"
Apr 17 17:57:20.757806 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.757815 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679"} err="failed to get container status \"b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679\": rpc error: code = NotFound desc = could not find container \"b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679\": container with ID starting with b640f7ac763aa174972747bf6b550392524cef34525f630cda98ca1636c20679 not found: ID does not exist"
Apr 17 17:57:20.758018 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.757841 2565 scope.go:117] "RemoveContainer" containerID="355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a"
Apr 17 17:57:20.758189 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:57:20.758130 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a\": container with ID starting with 355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a not found: ID does not exist" containerID="355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a"
Apr 17 17:57:20.758189 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.758169 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a"} err="failed to get container status \"355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a\": rpc error: code = NotFound desc = could not find container \"355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a\": container with ID starting with 355c2cf35df68c882934d5b5bfe854e01579c63b86f0ef5e94c7728bf8bea59a not found: ID does not exist"
Apr 17 17:57:20.758546 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.758192 2565 scope.go:117] "RemoveContainer" containerID="cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56"
Apr 17 17:57:20.758546 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:57:20.758518 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56\": container with ID starting with cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56 not found: ID does not exist" containerID="cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56"
Apr 17 17:57:20.758651 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.758544 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56"} err="failed to get container status \"cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56\": rpc error: code = NotFound desc = could not find container \"cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56\": container with ID starting with cc51bf6ca97a7a27d44b5d5461ae574b3ab7e71cae3ca53a754fa61efae05d56 not found: ID does not exist"
Apr 17 17:57:20.759926 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.759903 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2"]
Apr 17 17:57:20.762728 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:20.762697 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-d579c8cf-dfkt2"]
Apr 17 17:57:21.952891 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:21.952852 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" path="/var/lib/kubelet/pods/2df51df7-842a-4ce9-80d8-db314e9656ea/volumes"
Apr 17 17:57:34.110200 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:34.110164 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:34.205908 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:34.205873 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:34.221351 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:34.221319 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:34.235299 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:34.235272 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:35.268680 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:35.268646 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:35.318203 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:35.318164 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:35.331363 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:35.331338 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:35.348456 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:35.348433 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:36.361309 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:36.361276 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:36.412993 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:36.412964 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:36.423607 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:36.423553 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:36.434328 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:36.434301 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:37.418169 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:37.418103 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:37.467929 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:37.467892 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:37.482673 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:37.482645 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:37.499360 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:37.499328 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:38.524928 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:38.524899 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:38.591059 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:38.591031 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:38.605620 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:38.605591 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:38.625163 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:38.625141 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:39.631733 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:39.631701 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:39.686587 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:39.686555 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:39.696506 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:39.696482 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:39.712823 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:39.712802 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:40.809950 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:40.809918 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:40.862560 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:40.862525 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:40.875495 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:40.875459 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:40.891025 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:40.890998 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:41.966783 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:41.966748 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:42.024840 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:42.024807 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:42.054584 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:42.054549 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:42.080976 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:42.080950 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:43.158685 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:43.158656 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:43.206157 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:43.206127 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:43.219486 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:43.219458 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:43.238263 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:43.238219 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:44.243171 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:44.243137 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:44.297227 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:44.297195 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:44.308300 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:44.308270 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:44.320649 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:44.320627 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:45.353104 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:45.353079 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:45.399440 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:45.399411 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:45.410775 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:45.410746 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:45.425229 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:45.425209 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:46.409485 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:46.409456 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:46.454416 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:46.454386 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:46.463798 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:46.463776 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:46.475715 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:46.475686 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:47.463591 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:47.463555 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:47.548931 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:47.548900 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:47.562574 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:47.562546 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:47.576275 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:47.576238 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:48.599751 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:48.599723 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-gateway-1-openshift-default-6c59fbf55c-wkkpz_ad21ccfc-94b0-40d2-8013-58c9bfe91201/istio-proxy/0.log"
Apr 17 17:57:48.649356 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:48.649316 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:48.664253 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:48.664224 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/llm-d-routing-sidecar/0.log"
Apr 17 17:57:48.677546 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:48.677525 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/storage-initializer/0.log"
Apr 17 17:57:48.790007 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:48.789947 2565 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="llm-d-routing-sidecar" containerID="cri-o://9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df" gracePeriod=2
Apr 17 17:57:49.042372 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.042349 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:49.043035 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.043017 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv"
Apr 17 17:57:49.124486 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124397 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-model-cache\") pod \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") "
Apr 17 17:57:49.124486 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124454 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-dshm\") pod \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") "
Apr 17 17:57:49.124486 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124480 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vk6m5\" (UniqueName: \"kubernetes.io/projected/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kube-api-access-vk6m5\") pod \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") "
Apr 17 17:57:49.124715 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124524 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-home\") pod \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") "
Apr 17 17:57:49.124715 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124546 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e40adf7-9206-4b48-8e7b-2f5ef4822196-tls-certs\") pod \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") "
Apr 17 17:57:49.124715 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124579 2565 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kserve-provision-location\") pod \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\" (UID: \"2e40adf7-9206-4b48-8e7b-2f5ef4822196\") "
Apr 17 17:57:49.124850 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124696 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-model-cache" (OuterVolumeSpecName: "model-cache") pod "2e40adf7-9206-4b48-8e7b-2f5ef4822196" (UID: "2e40adf7-9206-4b48-8e7b-2f5ef4822196"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:57:49.124916 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124860 2565 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-model-cache\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:49.124998 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.124967 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-home" (OuterVolumeSpecName: "home") pod "2e40adf7-9206-4b48-8e7b-2f5ef4822196" (UID: "2e40adf7-9206-4b48-8e7b-2f5ef4822196"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:57:49.126690 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.126653 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2e40adf7-9206-4b48-8e7b-2f5ef4822196-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2e40adf7-9206-4b48-8e7b-2f5ef4822196" (UID: "2e40adf7-9206-4b48-8e7b-2f5ef4822196"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 17 17:57:49.126815 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.126712 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kube-api-access-vk6m5" (OuterVolumeSpecName: "kube-api-access-vk6m5") pod "2e40adf7-9206-4b48-8e7b-2f5ef4822196" (UID: "2e40adf7-9206-4b48-8e7b-2f5ef4822196"). InnerVolumeSpecName "kube-api-access-vk6m5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 17 17:57:49.126815 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.126747 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-dshm" (OuterVolumeSpecName: "dshm") pod "2e40adf7-9206-4b48-8e7b-2f5ef4822196" (UID: "2e40adf7-9206-4b48-8e7b-2f5ef4822196"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:57:49.179729 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.179687 2565 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2e40adf7-9206-4b48-8e7b-2f5ef4822196" (UID: "2e40adf7-9206-4b48-8e7b-2f5ef4822196"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 17 17:57:49.225551 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.225522 2565 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-dshm\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:49.225551 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.225546 2565 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vk6m5\" (UniqueName: \"kubernetes.io/projected/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kube-api-access-vk6m5\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:49.225551 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.225557 2565 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-home\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:49.225766 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.225566 2565 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2e40adf7-9206-4b48-8e7b-2f5ef4822196-tls-certs\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:49.225766 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.225575 2565 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2e40adf7-9206-4b48-8e7b-2f5ef4822196-kserve-provision-location\") on node \"ip-10-0-133-87.ec2.internal\" DevicePath \"\""
Apr 17 17:57:49.732227 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.732194 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-zwmhm_5abf0c93-54f4-4998-afdd-68635ef2572c/discovery/0.log"
Apr 17 17:57:49.755782 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.755754 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-trgbv_d47dd0f0-1374-487a-a2c1-0251ed74d094/istio-proxy/0.log"
Apr 17 17:57:49.838341 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.838311 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-6c54b96944-9l5gv_2e40adf7-9206-4b48-8e7b-2f5ef4822196/main/0.log"
Apr 17 17:57:49.838925 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.838903 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerID="3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f" exitCode=137
Apr 17 17:57:49.838925 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.838924 2565 generic.go:358] "Generic (PLEG): container finished" podID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerID="9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df" exitCode=0
Apr 17 17:57:49.839037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.838980 2565 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv"
Apr 17 17:57:49.839037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.838991 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" event={"ID":"2e40adf7-9206-4b48-8e7b-2f5ef4822196","Type":"ContainerDied","Data":"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f"}
Apr 17 17:57:49.839037 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.839030 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" event={"ID":"2e40adf7-9206-4b48-8e7b-2f5ef4822196","Type":"ContainerDied","Data":"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df"}
Apr 17 17:57:49.839142 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.839040 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv" event={"ID":"2e40adf7-9206-4b48-8e7b-2f5ef4822196","Type":"ContainerDied","Data":"45d1b638513a243c6a364d664fe6bb32ead039cd1576962e046de45430d8d138"}
Apr 17 17:57:49.839142 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.839054 2565 scope.go:117] "RemoveContainer" containerID="3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f"
Apr 17 17:57:49.858055 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.858026 2565 scope.go:117] "RemoveContainer" containerID="4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263"
Apr 17 17:57:49.865907 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.865881 2565 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv"]
Apr 17 17:57:49.869930 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.869908 2565 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-6c54b96944-9l5gv"]
Apr 17 17:57:49.870335 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.870313 2565 scope.go:117] "RemoveContainer" containerID="9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df"
Apr 17 17:57:49.877339 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.877318 2565 scope.go:117] "RemoveContainer" containerID="3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f"
Apr 17 17:57:49.877622 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:57:49.877602 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f\": container with ID starting with 3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f not found: ID does not exist" containerID="3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f"
Apr 17 17:57:49.877713 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.877634 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f"} err="failed to get container status \"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f\": rpc error: code = NotFound desc = could not find container \"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f\": container with ID starting with 3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f not found: ID does not exist"
Apr 17 17:57:49.877713 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.877658 2565 scope.go:117] "RemoveContainer" containerID="4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263"
Apr 17 17:57:49.877885 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:57:49.877870 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263\": container with ID starting with 4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263 not found: ID does not exist" containerID="4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263"
Apr 17 17:57:49.877930 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.877891 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263"} err="failed to get container status \"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263\": rpc error: code = NotFound desc = could not find container \"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263\": container with ID starting with 4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263 not found: ID does not exist"
Apr 17 17:57:49.877930 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.877905 2565 scope.go:117] "RemoveContainer" containerID="9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df"
Apr 17 17:57:49.878114 ip-10-0-133-87 kubenswrapper[2565]: E0417 17:57:49.878099 2565 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df\": container with ID starting with 9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df not found: ID does not exist" containerID="9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df"
Apr 17 17:57:49.878182 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.878123 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df"} err="failed to get container status \"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df\": rpc error: code = NotFound desc = could not find container \"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df\": container with ID starting with
9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df not found: ID does not exist" Apr 17 17:57:49.878182 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.878156 2565 scope.go:117] "RemoveContainer" containerID="3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f" Apr 17 17:57:49.878465 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.878440 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f"} err="failed to get container status \"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f\": rpc error: code = NotFound desc = could not find container \"3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f\": container with ID starting with 3a96078d7ad66c1040ecdf3cfc18438964e7637ff796af488e93e66d11f0374f not found: ID does not exist" Apr 17 17:57:49.878465 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.878461 2565 scope.go:117] "RemoveContainer" containerID="4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263" Apr 17 17:57:49.878675 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.878658 2565 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263"} err="failed to get container status \"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263\": rpc error: code = NotFound desc = could not find container \"4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263\": container with ID starting with 4a6c38676c9b869d9d871dd3bd1557524a534ea5d0baeb3cd6477ec8bf157263 not found: ID does not exist" Apr 17 17:57:49.878727 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.878676 2565 scope.go:117] "RemoveContainer" containerID="9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df" Apr 17 17:57:49.878895 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.878877 2565 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df"} err="failed to get container status \"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df\": rpc error: code = NotFound desc = could not find container \"9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df\": container with ID starting with 9dec57e783e67319c47fbb16b03c60ec0346ef272a1a5abdab7802cd3dbb88df not found: ID does not exist" Apr 17 17:57:49.953160 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:49.953126 2565 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" path="/var/lib/kubelet/pods/2e40adf7-9206-4b48-8e7b-2f5ef4822196/volumes" Apr 17 17:57:50.681149 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:50.681117 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-zwmhm_5abf0c93-54f4-4998-afdd-68635ef2572c/discovery/0.log" Apr 17 17:57:50.706828 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:50.706803 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-trgbv_d47dd0f0-1374-487a-a2c1-0251ed74d094/istio-proxy/0.log" Apr 17 17:57:51.592793 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:51.592743 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-b7fn5_7a57d275-8f91-411b-b982-840a5f8635cf/manager/0.log" Apr 17 17:57:51.710958 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:51.710927 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-8gp5z_5369b1e8-120b-4804-977d-218d836aebdb/manager/0.log" Apr 17 17:57:56.910970 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:56.910941 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kube-system_global-pull-secret-syncer-9c4x9_5fd1192d-ea25-4070-ac07-30ae8b3bade1/global-pull-secret-syncer/0.log" Apr 17 17:57:57.123293 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:57.123265 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-pzhcg_490c70db-b612-47c1-a980-96f5b8cc2cbc/konnectivity-agent/0.log" Apr 17 17:57:57.157909 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:57:57.157870 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-87.ec2.internal_8cf3e3e8476c10b85dc36d1342e54de1/haproxy/0.log" Apr 17 17:58:01.112964 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:01.112935 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-b7fn5_7a57d275-8f91-411b-b982-840a5f8635cf/manager/0.log" Apr 17 17:58:01.365855 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:01.365772 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-8gp5z_5369b1e8-120b-4804-977d-218d836aebdb/manager/0.log" Apr 17 17:58:02.726606 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:02.726579 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-7vwns_286e6e57-04bf-44ea-87f4-b8e042bdcf20/monitoring-plugin/0.log" Apr 17 17:58:02.769504 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:02.769461 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gb9wj_768baf92-5c56-4bc1-91c9-eb70f2d76c7b/node-exporter/0.log" Apr 17 17:58:02.812372 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:02.812345 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gb9wj_768baf92-5c56-4bc1-91c9-eb70f2d76c7b/kube-rbac-proxy/0.log" Apr 17 17:58:02.849134 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:02.849110 2565 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-gb9wj_768baf92-5c56-4bc1-91c9-eb70f2d76c7b/init-textfile/0.log" Apr 17 17:58:03.455876 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:03.455832 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t4qtq_57e3e3e5-be2c-461c-95d8-687cb6527c2f/prometheus-operator/0.log" Apr 17 17:58:03.489177 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:03.489126 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-t4qtq_57e3e3e5-be2c-461c-95d8-687cb6527c2f/kube-rbac-proxy/0.log" Apr 17 17:58:05.391762 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.391723 2565 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2"] Apr 17 17:58:05.392236 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392217 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="storage-initializer" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392259 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="storage-initializer" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392275 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="storage-initializer" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392284 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="storage-initializer" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392298 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" 
containerName="tokenizer" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392306 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="tokenizer" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392318 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" containerName="storage-initializer" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392326 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" containerName="storage-initializer" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392336 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" Apr 17 17:58:05.392346 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392343 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392358 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" containerName="main" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392366 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" containerName="main" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392379 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="main" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392387 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="main" Apr 17 17:58:05.392784 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:58:05.392400 2565 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="llm-d-routing-sidecar" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392408 2565 state_mem.go:107] "Deleted CPUSet assignment" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="llm-d-routing-sidecar" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392507 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="main" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392522 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="main" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392532 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2e40adf7-9206-4b48-8e7b-2f5ef4822196" containerName="llm-d-routing-sidecar" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392543 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="54a44e95-7aae-42bc-8fa3-b779fa49c0bc" containerName="main" Apr 17 17:58:05.392784 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.392552 2565 memory_manager.go:356] "RemoveStaleState removing state" podUID="2df51df7-842a-4ce9-80d8-db314e9656ea" containerName="tokenizer" Apr 17 17:58:05.395947 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.395927 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.398207 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.398184 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6j5hz\"/\"kube-root-ca.crt\"" Apr 17 17:58:05.399275 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.399239 2565 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-6j5hz\"/\"openshift-service-ca.crt\"" Apr 17 17:58:05.399397 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.399379 2565 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-6j5hz\"/\"default-dockercfg-f7m6d\"" Apr 17 17:58:05.413134 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.413106 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2"] Apr 17 17:58:05.563708 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.563665 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-lib-modules\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.563708 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.563706 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-proc\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.563968 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.563822 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-c2vtj\" (UniqueName: \"kubernetes.io/projected/df0e03cb-9420-4757-9122-22755376f1e4-kube-api-access-c2vtj\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.563968 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.563871 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-podres\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.563968 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.563928 2565 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-sys\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665121 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665036 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-lib-modules\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665121 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665072 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-proc\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " 
pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665121 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665116 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c2vtj\" (UniqueName: \"kubernetes.io/projected/df0e03cb-9420-4757-9122-22755376f1e4-kube-api-access-c2vtj\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665380 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665152 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-podres\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665380 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665180 2565 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-sys\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665380 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665182 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-proc\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665380 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665210 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: 
\"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-lib-modules\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665380 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665304 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-sys\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.665380 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.665358 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/df0e03cb-9420-4757-9122-22755376f1e4-podres\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.674053 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.674030 2565 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c2vtj\" (UniqueName: \"kubernetes.io/projected/df0e03cb-9420-4757-9122-22755376f1e4-kube-api-access-c2vtj\") pod \"perf-node-gather-daemonset-l9kz2\" (UID: \"df0e03cb-9420-4757-9122-22755376f1e4\") " pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.706117 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.706092 2565 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:05.829906 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.829872 2565 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2"] Apr 17 17:58:05.833292 ip-10-0-133-87 kubenswrapper[2565]: W0417 17:58:05.833264 2565 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-poddf0e03cb_9420_4757_9122_22755376f1e4.slice/crio-f046c9539d504496880d5e4bb5401cf117977db108d8724bb7cd71193f960f53 WatchSource:0}: Error finding container f046c9539d504496880d5e4bb5401cf117977db108d8724bb7cd71193f960f53: Status 404 returned error can't find the container with id f046c9539d504496880d5e4bb5401cf117977db108d8724bb7cd71193f960f53 Apr 17 17:58:05.834933 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.834917 2565 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 17 17:58:05.897624 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.897589 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" event={"ID":"df0e03cb-9420-4757-9122-22755376f1e4","Type":"ContainerStarted","Data":"f046c9539d504496880d5e4bb5401cf117977db108d8724bb7cd71193f960f53"} Apr 17 17:58:05.952968 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:05.952931 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-766d7bb8d4-ngwjt_daaa0496-35db-43d8-80e2-7df51aa0cbe7/console/0.log" Apr 17 17:58:06.904661 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:06.904629 2565 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" event={"ID":"df0e03cb-9420-4757-9122-22755376f1e4","Type":"ContainerStarted","Data":"2facf7507eb76775a3d4e86df0af18a42b6376d092b7b933ce0004052dbf12be"} Apr 17 17:58:06.904661 ip-10-0-133-87 
kubenswrapper[2565]: I0417 17:58:06.904679 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:06.923836 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:06.923732 2565 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" podStartSLOduration=1.923713685 podStartE2EDuration="1.923713685s" podCreationTimestamp="2026-04-17 17:58:05 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-17 17:58:06.922261194 +0000 UTC m=+2027.580305494" watchObservedRunningTime="2026-04-17 17:58:06.923713685 +0000 UTC m=+2027.581757993" Apr 17 17:58:07.410315 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:07.410273 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-q2rfs_e5843e29-8a21-4a19-ab43-f6529fb056ca/dns/0.log" Apr 17 17:58:07.437263 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:07.437217 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-q2rfs_e5843e29-8a21-4a19-ab43-f6529fb056ca/kube-rbac-proxy/0.log" Apr 17 17:58:07.582863 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:07.582833 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wq4vk_7fe5d881-6f86-4878-aefd-8707bc6216fb/dns-node-resolver/0.log" Apr 17 17:58:08.084528 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:08.084488 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-5b896476cd-bjzsc_eba52262-b7bd-445f-9a9f-a8688fc5e324/registry/0.log" Apr 17 17:58:08.110433 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:08.110404 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mz8c6_7842b7a8-6434-47ce-8ce5-d1c63dd11da1/node-ca/0.log" Apr 17 
17:58:09.110981 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:09.110940 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-zwmhm_5abf0c93-54f4-4998-afdd-68635ef2572c/discovery/0.log" Apr 17 17:58:09.146205 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:09.146174 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_openshift-ai-inference-openshift-default-7c5447bb76-trgbv_d47dd0f0-1374-487a-a2c1-0251ed74d094/istio-proxy/0.log" Apr 17 17:58:09.716138 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:09.716103 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-vvr96_f3128c11-df72-45ff-b3c0-ac28a0f059c1/serve-healthcheck-canary/0.log" Apr 17 17:58:10.209312 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:10.209284 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jrddb_f8a29699-81b9-4195-bf3c-aec014a689f0/kube-rbac-proxy/0.log" Apr 17 17:58:10.233977 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:10.233943 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jrddb_f8a29699-81b9-4195-bf3c-aec014a689f0/exporter/0.log" Apr 17 17:58:10.265561 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:10.265529 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-jrddb_f8a29699-81b9-4195-bf3c-aec014a689f0/extractor/0.log" Apr 17 17:58:12.918842 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:12.918817 2565 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-6j5hz/perf-node-gather-daemonset-l9kz2" Apr 17 17:58:13.802779 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:13.802733 2565 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_kserve-controller-manager-85dd7cfb4d-rn92v_33b90b00-ed45-4e02-bc2c-6f2abe04274f/manager/0.log" Apr 17 17:58:13.920972 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:13.920940 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-tzhl6_8e84e5ce-7e2f-40d3-9055-3d12d32d8ff2/server/0.log" Apr 17 17:58:14.282584 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:14.282544 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-jnwcl_8f4284f1-4844-4c8b-872b-f59118b23f2a/manager/0.log" Apr 17 17:58:19.644970 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:19.644937 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-hm242_b7946c65-567b-4c5b-9c52-c4a315fd0c39/migrator/0.log" Apr 17 17:58:19.674818 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:19.674770 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-hm242_b7946c65-567b-4c5b-9c52-c4a315fd0c39/graceful-termination/0.log" Apr 17 17:58:21.276942 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:21.276909 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-86k48_5b8bc3fe-beaf-4464-9be8-84388596e77a/kube-multus/0.log" Apr 17 17:58:21.567567 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:21.567488 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jwq8_5025af7c-9beb-4b9e-9e90-2e2d44ab467a/kube-multus-additional-cni-plugins/0.log" Apr 17 17:58:21.614392 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:21.614361 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jwq8_5025af7c-9beb-4b9e-9e90-2e2d44ab467a/egress-router-binary-copy/0.log" Apr 17 17:58:21.659747 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:21.659719 
2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jwq8_5025af7c-9beb-4b9e-9e90-2e2d44ab467a/cni-plugins/0.log" Apr 17 17:58:21.689112 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:21.689082 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jwq8_5025af7c-9beb-4b9e-9e90-2e2d44ab467a/bond-cni-plugin/0.log" Apr 17 17:58:21.716730 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:21.716701 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jwq8_5025af7c-9beb-4b9e-9e90-2e2d44ab467a/routeoverride-cni/0.log" Apr 17 17:58:21.742937 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:21.742915 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jwq8_5025af7c-9beb-4b9e-9e90-2e2d44ab467a/whereabouts-cni-bincopy/0.log" Apr 17 17:58:21.771913 ip-10-0-133-87 kubenswrapper[2565]: I0417 17:58:21.771888 2565 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-4jwq8_5025af7c-9beb-4b9e-9e90-2e2d44ab467a/whereabouts-cni/0.log"