Apr 16 13:56:53.700539 ip-10-0-138-227 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 16 13:56:53.700553 ip-10-0-138-227 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 16 13:56:53.700562 ip-10-0-138-227 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 16 13:56:53.700873 ip-10-0-138-227 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 16 13:57:03.879566 ip-10-0-138-227 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 16 13:57:03.879584 ip-10-0-138-227 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot 935222622a824452a6d22028e39d4cb3 --
Apr 16 13:59:19.693556 ip-10-0-138-227 systemd[1]: Starting Kubernetes Kubelet...
Apr 16 13:59:20.107073 ip-10-0-138-227 kubenswrapper[2569]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:20.107073 ip-10-0-138-227 kubenswrapper[2569]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 16 13:59:20.107073 ip-10-0-138-227 kubenswrapper[2569]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:20.107073 ip-10-0-138-227 kubenswrapper[2569]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 16 13:59:20.107073 ip-10-0-138-227 kubenswrapper[2569]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 16 13:59:20.110336 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.110232 2569 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 16 13:59:20.115037 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115023 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115039 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
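The deprecation notices above all point to the same remedy: move those command-line flags into the kubelet configuration file named by --config (reported later in this boot's FLAG dump as /etc/kubernetes/kubelet.conf). A minimal sketch of what the corresponding KubeletConfiguration fields could look like for the values that appear in this log; PyYAML is used here purely to render the YAML, and how the file is written or managed on this node is not taken from the log.

# Minimal sketch only: KubeletConfiguration fields corresponding to the
# deprecated flags reported above, using values visible in this boot's
# FLAG dump. Field names follow the kubelet.config.k8s.io/v1beta1 API.
import yaml  # PyYAML, assumed to be installed

kubelet_config = {
    "apiVersion": "kubelet.config.k8s.io/v1beta1",
    "kind": "KubeletConfiguration",
    # replaces --container-runtime-endpoint="/var/run/crio/crio.sock"
    "containerRuntimeEndpoint": "/var/run/crio/crio.sock",
    # replaces --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec"
    "volumePluginDir": "/etc/kubernetes/kubelet-plugins/volume/exec",
    # replaces --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi"
    "systemReserved": {"cpu": "500m", "ephemeral-storage": "1Gi", "memory": "1Gi"},
    # --minimum-container-ttl-duration has no direct field here; the log
    # advises using eviction settings (--eviction-hard / --eviction-soft) instead.
}

print(yaml.safe_dump(kubelet_config, sort_keys=False))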
Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115044 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115047 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115050 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115053 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115056 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115059 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115062 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115065 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115067 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115072 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115075 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115077 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:20.115073 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115081 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115083 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115087 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115089 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115092 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115095 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115097 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115100 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115104 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115107 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115110 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115113 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115116 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115119 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115122 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115125 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115128 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115137 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115139 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:20.115425 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115142 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115145 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115147 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115150 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115152 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115155 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115157 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115160 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115162 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115165 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115168 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115171 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 
13:59:20.115174 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115176 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115179 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115182 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115185 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115187 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115190 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115193 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:20.115891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115195 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115198 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115200 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115203 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115206 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115208 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115211 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115214 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115216 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115219 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115221 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115224 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115226 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115229 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115231 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 
13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115233 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115236 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115238 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115241 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115243 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:20.116413 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115246 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115248 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115251 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115254 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115258 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115260 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115263 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115278 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115281 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115283 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115286 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115289 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115291 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115671 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115677 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115680 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115683 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 
13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115686 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115690 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:20.116891 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115694 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115697 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115700 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115703 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115706 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115709 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115712 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115714 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115717 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115720 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115722 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115725 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115728 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115730 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115733 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115735 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115738 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115741 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115744 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:20.117364 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115746 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:20.117364 ip-10-0-138-227 
kubenswrapper[2569]: W0416 13:59:20.115749 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115752 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115755 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115757 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115760 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115763 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115765 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115768 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115770 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115773 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115776 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115778 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115781 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115784 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115786 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115789 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115791 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115794 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115796 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:20.117893 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115799 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115802 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115804 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:20.118411 ip-10-0-138-227 
kubenswrapper[2569]: W0416 13:59:20.115808 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115811 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115813 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115816 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115819 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115821 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115825 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115828 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115833 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115836 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115838 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115841 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115844 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115846 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115849 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115852 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115854 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:20.118411 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115857 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115859 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115862 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115864 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115867 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115869 2569 feature_gate.go:328] unrecognized feature gate: 
ImageStreamImportMode Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115872 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115876 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115879 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115881 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115884 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115887 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115889 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115892 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115894 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115897 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115900 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115902 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115905 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115908 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:20.118887 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.115910 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116601 2569 flags.go:64] FLAG: --address="0.0.0.0" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116611 2569 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116617 2569 flags.go:64] FLAG: --anonymous-auth="true" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116622 2569 flags.go:64] FLAG: --application-metrics-count-limit="100" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116626 2569 flags.go:64] FLAG: --authentication-token-webhook="false" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116630 2569 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116634 2569 flags.go:64] FLAG: --authorization-mode="AlwaysAllow" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116638 2569 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s" Apr 16 13:59:20.119380 ip-10-0-138-227 
kubenswrapper[2569]: I0416 13:59:20.116641 2569 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116644 2569 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116648 2569 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116651 2569 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116655 2569 flags.go:64] FLAG: --cgroup-driver="cgroupfs" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116657 2569 flags.go:64] FLAG: --cgroup-root="" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116660 2569 flags.go:64] FLAG: --cgroups-per-qos="true" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116663 2569 flags.go:64] FLAG: --client-ca-file="" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116666 2569 flags.go:64] FLAG: --cloud-config="" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116669 2569 flags.go:64] FLAG: --cloud-provider="external" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116672 2569 flags.go:64] FLAG: --cluster-dns="[]" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116679 2569 flags.go:64] FLAG: --cluster-domain="" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116682 2569 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116686 2569 flags.go:64] FLAG: --config-dir="" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116689 2569 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116692 2569 flags.go:64] FLAG: --container-log-max-files="5" Apr 16 13:59:20.119380 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116696 2569 flags.go:64] FLAG: --container-log-max-size="10Mi" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116699 2569 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116702 2569 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116706 2569 flags.go:64] FLAG: --containerd-namespace="k8s.io" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116709 2569 flags.go:64] FLAG: --contention-profiling="false" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116711 2569 flags.go:64] FLAG: --cpu-cfs-quota="true" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116714 2569 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116717 2569 flags.go:64] FLAG: --cpu-manager-policy="none" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116720 2569 flags.go:64] FLAG: --cpu-manager-policy-options="" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116725 2569 flags.go:64] FLAG: 
--cpu-manager-reconcile-period="10s" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116728 2569 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116731 2569 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116734 2569 flags.go:64] FLAG: --enable-load-reader="false" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116737 2569 flags.go:64] FLAG: --enable-server="true" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116740 2569 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116744 2569 flags.go:64] FLAG: --event-burst="100" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116747 2569 flags.go:64] FLAG: --event-qps="50" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116750 2569 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116753 2569 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116756 2569 flags.go:64] FLAG: --eviction-hard="" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116760 2569 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116763 2569 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116766 2569 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116769 2569 flags.go:64] FLAG: --eviction-soft="" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116772 2569 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 16 13:59:20.119988 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116775 2569 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116778 2569 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116786 2569 flags.go:64] FLAG: --experimental-mounter-path="" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116789 2569 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116792 2569 flags.go:64] FLAG: --fail-swap-on="true" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116795 2569 flags.go:64] FLAG: --feature-gates="" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116799 2569 flags.go:64] FLAG: --file-check-frequency="20s" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116802 2569 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116805 2569 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116808 2569 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116811 2569 
flags.go:64] FLAG: --healthz-port="10248" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116814 2569 flags.go:64] FLAG: --help="false" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116817 2569 flags.go:64] FLAG: --hostname-override="ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116820 2569 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116823 2569 flags.go:64] FLAG: --http-check-frequency="20s" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116826 2569 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116830 2569 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116833 2569 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116836 2569 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116838 2569 flags.go:64] FLAG: --image-service-endpoint="" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116842 2569 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116845 2569 flags.go:64] FLAG: --kube-api-burst="100" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116848 2569 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116851 2569 flags.go:64] FLAG: --kube-api-qps="50" Apr 16 13:59:20.120606 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116853 2569 flags.go:64] FLAG: --kube-reserved="" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116857 2569 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116860 2569 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116863 2569 flags.go:64] FLAG: --kubelet-cgroups="" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116865 2569 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116868 2569 flags.go:64] FLAG: --lock-file="" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116871 2569 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116873 2569 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116877 2569 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116882 2569 flags.go:64] FLAG: --log-json-split-stream="false" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116886 2569 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116889 2569 flags.go:64] 
FLAG: --log-text-split-stream="false" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116892 2569 flags.go:64] FLAG: --logging-format="text" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116895 2569 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116899 2569 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116901 2569 flags.go:64] FLAG: --manifest-url="" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116905 2569 flags.go:64] FLAG: --manifest-url-header="" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116910 2569 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116913 2569 flags.go:64] FLAG: --max-open-files="1000000" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116917 2569 flags.go:64] FLAG: --max-pods="110" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116921 2569 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116924 2569 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116927 2569 flags.go:64] FLAG: --memory-manager-policy="None" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116930 2569 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116933 2569 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 16 13:59:20.121185 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116936 2569 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116939 2569 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116950 2569 flags.go:64] FLAG: --node-status-max-images="50" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116953 2569 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116957 2569 flags.go:64] FLAG: --oom-score-adj="-999" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116960 2569 flags.go:64] FLAG: --pod-cidr="" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116963 2569 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:dc76bab72f320de3d4105c90d73c4fb139c09e20ce0fa8dcbc0cb59920d27dec" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116969 2569 flags.go:64] FLAG: --pod-manifest-path="" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116971 2569 flags.go:64] FLAG: --pod-max-pids="-1" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116974 2569 flags.go:64] FLAG: --pods-per-core="0" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116977 2569 flags.go:64] FLAG: --port="10250" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116980 2569 flags.go:64] FLAG: 
--protect-kernel-defaults="false" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116983 2569 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0615e4e97e9adc63b" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116987 2569 flags.go:64] FLAG: --qos-reserved="" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116989 2569 flags.go:64] FLAG: --read-only-port="10255" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116992 2569 flags.go:64] FLAG: --register-node="true" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116996 2569 flags.go:64] FLAG: --register-schedulable="true" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.116999 2569 flags.go:64] FLAG: --register-with-taints="" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117003 2569 flags.go:64] FLAG: --registry-burst="10" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117006 2569 flags.go:64] FLAG: --registry-qps="5" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117009 2569 flags.go:64] FLAG: --reserved-cpus="" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117012 2569 flags.go:64] FLAG: --reserved-memory="" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117015 2569 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117018 2569 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117021 2569 flags.go:64] FLAG: --rotate-certificates="false" Apr 16 13:59:20.121798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117024 2569 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117027 2569 flags.go:64] FLAG: --runonce="false" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117030 2569 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117033 2569 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117036 2569 flags.go:64] FLAG: --seccomp-default="false" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117038 2569 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117042 2569 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117044 2569 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117048 2569 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117050 2569 flags.go:64] FLAG: --storage-driver-password="root" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117053 2569 flags.go:64] FLAG: --storage-driver-secure="false" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117056 2569 flags.go:64] FLAG: --storage-driver-table="stats" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117059 2569 flags.go:64] FLAG: 
--storage-driver-user="root" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117062 2569 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117065 2569 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117068 2569 flags.go:64] FLAG: --system-cgroups="" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117070 2569 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117076 2569 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117079 2569 flags.go:64] FLAG: --tls-cert-file="" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117081 2569 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117086 2569 flags.go:64] FLAG: --tls-min-version="" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117089 2569 flags.go:64] FLAG: --tls-private-key-file="" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117092 2569 flags.go:64] FLAG: --topology-manager-policy="none" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117096 2569 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117099 2569 flags.go:64] FLAG: --topology-manager-scope="container" Apr 16 13:59:20.122404 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117103 2569 flags.go:64] FLAG: --v="2" Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117107 2569 flags.go:64] FLAG: --version="false" Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117111 2569 flags.go:64] FLAG: --vmodule="" Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117115 2569 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117118 2569 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117208 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117212 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117215 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117217 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117220 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117223 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117225 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117228 2569 
feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117230 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117233 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117235 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117238 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117240 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117243 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117246 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117249 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:20.123030 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117252 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117254 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117257 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117259 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117262 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117278 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117281 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117283 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117286 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117290 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117293 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117295 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117298 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117301 2569 feature_gate.go:328] unrecognized feature gate: 
InsightsOnDemandDataGather Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117303 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117306 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117308 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117311 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117313 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:20.123687 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117316 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117319 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117322 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117326 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117329 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117332 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117335 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117337 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117340 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117342 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117345 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117348 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117350 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117353 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117356 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117358 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117361 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints 
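The feature_gate.go:328 warnings above and immediately below repeat the same set of unrecognized gate names several times within this one boot, which makes the raw journal hard to scan. A small, self-contained sketch for reducing output like this to the distinct gate names; the input file name is illustrative and assumes the journal text (for example from journalctl -u kubelet) has been saved locally.

# Minimal sketch: collect the distinct "unrecognized feature gate" names
# from kubelet journal output like the excerpt above.
import re
import sys

PATTERN = re.compile(r"feature_gate\.go:\d+\]\s+unrecognized feature gate:\s+(\S+)")

def unrecognized_gates(text):
    """Return the sorted set of gate names mentioned in the given log text."""
    return sorted({m.group(1) for m in PATTERN.finditer(text)})

if __name__ == "__main__":
    path = sys.argv[1] if len(sys.argv) > 1 else "kubelet-journal.txt"  # illustrative default
    with open(path, encoding="utf-8") as f:
        for name in unrecognized_gates(f.read()):
            print(name)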
Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117363 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117366 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:20.124449 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117369 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117371 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117374 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117378 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117380 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117383 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117386 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117389 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117393 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117396 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117398 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117401 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117404 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117407 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117409 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117412 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117415 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117417 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117420 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:20.124927 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117422 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:20.125422 
ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117425 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117427 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117430 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117432 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117435 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117437 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117440 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117443 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117445 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117448 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117450 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.117453 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:20.125422 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.117932 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:20.125800 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.125783 2569 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 16 13:59:20.125829 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.125805 2569 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 16 13:59:20.125882 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125873 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125883 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125888 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125892 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125895 2569 feature_gate.go:328] unrecognized feature gate: 
NewOLMOwnSingleNamespace Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125899 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125901 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125904 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125907 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125910 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125912 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125915 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:20.125913 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125918 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125922 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125927 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125930 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125933 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125935 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125938 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125941 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125943 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125946 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125948 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125952 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125956 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125960 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125965 2569 feature_gate.go:328] unrecognized feature gate: 
AzureClusterHostedDNSInstall Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125968 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125971 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125973 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125976 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:20.126198 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125978 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125981 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125983 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125986 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125988 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125991 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125994 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125997 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.125999 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126001 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126004 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126006 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126009 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126011 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126014 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126017 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126020 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126023 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 
13:59:20.126026 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126029 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:20.126673 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126033 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126037 2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126042 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126045 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126049 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126053 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126055 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126058 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126060 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126063 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126066 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126068 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126071 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126073 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126076 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126078 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126081 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126083 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126086 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:20.127154 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126089 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126091 2569 feature_gate.go:328] 
unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126094 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126097 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126099 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126102 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126105 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126110 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126114 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126119 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126122 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126124 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126127 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126129 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126132 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126134 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:20.127715 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.126139 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126243 2569 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126249 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126252 2569 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126254 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 
13:59:20.126257 2569 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126261 2569 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126277 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126282 2569 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126286 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126288 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126291 2569 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126293 2569 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126296 2569 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126298 2569 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126301 2569 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126303 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126306 2569 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126310 2569 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 16 13:59:20.128160 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126314 2569 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126320 2569 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126325 2569 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126328 2569 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126332 2569 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126335 2569 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126338 2569 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126340 2569 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126343 2569 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126346 2569 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126349 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126354 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126358 2569 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126362 2569 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126365 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126368 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126370 2569 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126372 2569 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126375 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 16 13:59:20.128637 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126377 2569 feature_gate.go:328] unrecognized feature gate: Example2 Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126380 2569 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126383 2569 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126385 2569 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126388 2569 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126390 
2569 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126393 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126395 2569 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126398 2569 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126400 2569 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126403 2569 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126405 2569 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126408 2569 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126411 2569 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126414 2569 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126417 2569 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126419 2569 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126422 2569 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126427 2569 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126431 2569 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 16 13:59:20.129093 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126436 2569 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126440 2569 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126443 2569 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126445 2569 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126448 2569 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126450 2569 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126453 2569 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126455 2569 feature_gate.go:328] 
unrecognized feature gate: ManagedBootImagesAzure Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126458 2569 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126461 2569 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126463 2569 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126466 2569 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126468 2569 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126471 2569 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126473 2569 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126476 2569 feature_gate.go:328] unrecognized feature gate: Example Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126479 2569 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126482 2569 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126484 2569 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126487 2569 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 16 13:59:20.129592 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126489 2569 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126492 2569 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126495 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126497 2569 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126499 2569 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126502 2569 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126508 2569 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126514 2569 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:20.126518 2569 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.126523 2569 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 16 13:59:20.130066 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.127125 2569 server.go:962] "Client rotation is on, will bootstrap in background" Apr 16 13:59:20.130400 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.130191 2569 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 16 13:59:20.131112 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.131100 2569 server.go:1019] "Starting client certificate rotation" Apr 16 13:59:20.131217 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.131198 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:59:20.131258 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.131239 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 16 13:59:20.154737 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.154712 2569 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:59:20.158633 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.158610 2569 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 16 13:59:20.171102 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.171078 2569 log.go:25] "Validated CRI v1 runtime API" Apr 16 13:59:20.179118 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.179095 2569 log.go:25] "Validated CRI v1 image API" Apr 16 13:59:20.180896 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.180879 2569 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 16 13:59:20.185375 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.185352 2569 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 7e954aa9-f559-4857-8760-ff7a5edceebe:/dev/nvme0n1p4 db3d8d95-49c9-4d27-b728-531dbfc64660:/dev/nvme0n1p3] Apr 16 13:59:20.185437 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.185375 2569 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}] Apr 16 13:59:20.187720 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.187704 2569 
reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:59:20.190961 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.190824 2569 manager.go:217] Machine: {Timestamp:2026-04-16 13:59:20.189128305 +0000 UTC m=+0.382944231 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3107170 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec21bde8287071df40bb33d25f458f26 SystemUUID:ec21bde8-2870-71df-40bb-33d25f458f26 BootID:93522262-2a82-4452-a6d2-2028e39d4cb3 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6098944 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:8d:03:69:46:41 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:8d:03:69:46:41 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:96:8e:a5:d0:29:32 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None} Apr 16 13:59:20.190961 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.190957 2569 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available. 
Apr 16 13:59:20.191063 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.191041 2569 manager.go:233] Version: {KernelVersion:5.14.0-570.104.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260401-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:} Apr 16 13:59:20.192014 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.191988 2569 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[] Apr 16 13:59:20.192151 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.192017 2569 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-138-227.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 16 13:59:20.192200 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.192160 2569 topology_manager.go:138] "Creating topology manager with none policy" Apr 16 13:59:20.192200 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.192167 2569 container_manager_linux.go:306] "Creating device plugin manager" Apr 16 13:59:20.192200 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.192180 2569 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:20.192799 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.192789 2569 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 16 13:59:20.193997 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.193987 2569 state_mem.go:36] "Initialized new in-memory state store" Apr 16 13:59:20.194109 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.194100 2569 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 16 13:59:20.196187 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.196178 2569 kubelet.go:491] "Attempting to sync node with API server" Apr 16 13:59:20.196226 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.196191 2569 kubelet.go:386] "Adding static pod 
path" path="/etc/kubernetes/manifests" Apr 16 13:59:20.196226 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.196204 2569 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 16 13:59:20.196226 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.196213 2569 kubelet.go:397] "Adding apiserver pod source" Apr 16 13:59:20.196226 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.196221 2569 apiserver.go:42] "Waiting for node sync before watching apiserver pods" Apr 16 13:59:20.197233 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.197221 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:20.197294 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.197240 2569 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 16 13:59:20.199944 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.199926 2569 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 16 13:59:20.201539 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.201526 2569 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 16 13:59:20.202719 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202702 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 16 13:59:20.202796 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202723 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 16 13:59:20.202796 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202732 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 16 13:59:20.202796 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202743 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 16 13:59:20.202796 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202753 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 16 13:59:20.202796 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202763 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 16 13:59:20.202796 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202773 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 16 13:59:20.202796 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202786 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 16 13:59:20.202796 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202801 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 16 13:59:20.203144 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202813 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 16 13:59:20.203144 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202830 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 16 13:59:20.203144 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.202848 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 16 13:59:20.204425 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.204414 2569 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 16 13:59:20.204465 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.204427 2569 plugins.go:616] "Loaded volume plugin" 
pluginName="kubernetes.io/image" Apr 16 13:59:20.208063 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.208051 2569 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 16 13:59:20.208124 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.208087 2569 server.go:1295] "Started kubelet" Apr 16 13:59:20.208227 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.208189 2569 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 16 13:59:20.208292 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.208250 2569 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 16 13:59:20.208346 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.208186 2569 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 16 13:59:20.209111 ip-10-0-138-227 systemd[1]: Started Kubernetes Kubelet. Apr 16 13:59:20.209362 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.209344 2569 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-138-227.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 16 13:59:20.209771 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.209643 2569 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 16 13:59:20.210170 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.210104 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-138-227.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 16 13:59:20.210221 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.210180 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 16 13:59:20.210423 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.210412 2569 server.go:317] "Adding debug handlers to kubelet server" Apr 16 13:59:20.214555 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.214255 2569 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:20.214789 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.214608 2569 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 16 13:59:20.217016 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.216384 2569 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 16 13:59:20.217016 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.216478 2569 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 16 13:59:20.217016 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.216580 2569 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 16 13:59:20.217016 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.216728 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:20.217016 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.216772 2569 reconstruct.go:97] "Volume reconstruction finished" Apr 16 13:59:20.217016 ip-10-0-138-227 kubenswrapper[2569]: I0416 
13:59:20.216785 2569 reconciler.go:26] "Reconciler: start to sync state"
Apr 16 13:59:20.217312 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.217021 2569 factory.go:55] Registering systemd factory
Apr 16 13:59:20.217312 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.217079 2569 factory.go:223] Registration of the systemd container factory successfully
Apr 16 13:59:20.217698 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.217676 2569 factory.go:153] Registering CRI-O factory
Apr 16 13:59:20.217698 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.217702 2569 factory.go:223] Registration of the crio container factory successfully
Apr 16 13:59:20.217858 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.217816 2569 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 16 13:59:20.217858 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.217846 2569 factory.go:103] Registering Raw factory
Apr 16 13:59:20.217943 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.217900 2569 manager.go:1196] Started watching for new ooms in manager
Apr 16 13:59:20.218385 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.218368 2569 manager.go:319] Starting recovery of all containers
Apr 16 13:59:20.220896 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.220861 2569 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 16 13:59:20.221555 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.221529 2569 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 16 13:59:20.221555 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.221542 2569 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-138-227.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 16 13:59:20.222611 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.221615 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-227.ec2.internal.18a6db0eedb0f96c default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-227.ec2.internal,UID:ip-10-0-138-227.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-138-227.ec2.internal,},FirstTimestamp:2026-04-16 13:59:20.208062828 +0000 UTC m=+0.401878754,LastTimestamp:2026-04-16 13:59:20.208062828 +0000 UTC m=+0.401878754,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-227.ec2.internal,}"
Apr 16 13:59:20.231135 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.231024 2569 manager.go:324] Recovery completed
Apr 16 13:59:20.232480 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.232462 2569 watcher.go:152] Failed to watch directory "/sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service": inotify_add_watch /sys/fs/cgroup/system.slice/systemd-update-utmp-runlevel.service: no such file or directory
Apr 16 13:59:20.235729 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.235717 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:20.236948 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.236931 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vzsx7"
Apr 16 13:59:20.238148 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.238134 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:20.238212 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.238162 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:20.238212 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.238173 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:20.238625 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.238612 2569 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 16 13:59:20.238625 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.238623 2569 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 16 13:59:20.238761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.238641 2569 state_mem.go:36] "Initialized new in-memory state store"
Apr 16 13:59:20.239669 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.239569 2569 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-138-227.ec2.internal.18a6db0eef7c08a0 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-138-227.ec2.internal,UID:ip-10-0-138-227.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-138-227.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-138-227.ec2.internal,},FirstTimestamp:2026-04-16 13:59:20.238147744 +0000 UTC m=+0.431963669,LastTimestamp:2026-04-16 13:59:20.238147744 +0000 UTC m=+0.431963669,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-138-227.ec2.internal,}"
Apr 16 13:59:20.241667 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.241651 2569 policy_none.go:49] "None policy: Start"
Apr 16 13:59:20.241735 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.241671 2569 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 16 13:59:20.241735 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.241684 2569 state_mem.go:35] "Initializing new in-memory state store"
Apr 16 13:59:20.243363 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.243346 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-vzsx7"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.286844 2569 manager.go:341] "Starting Device Plugin manager"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.286881 2569 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.286895 2569 server.go:85] "Starting device plugin registration server"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.287207 2569 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.287220 2569 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.287358 2569 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.287450 2569 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.287458 2569 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.287954 2569 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 16 13:59:20.294920 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.287994 2569 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-138-227.ec2.internal\" not found"
Apr 16 13:59:20.341323 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.341293 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 16 13:59:20.342564 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.342538 2569 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 16 13:59:20.342564 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.342566 2569 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 16 13:59:20.342728 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.342582 2569 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 16 13:59:20.342728 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.342589 2569 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 16 13:59:20.342728 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.342632 2569 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 16 13:59:20.345405 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.345388 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 16 13:59:20.387803 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.387760 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:20.388707 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.388690 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:20.388782 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.388723 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:20.388782 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.388736 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:20.388782 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.388768 2569 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-138-227.ec2.internal"
Apr 16 13:59:20.397445 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.397430 2569 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-138-227.ec2.internal"
Apr 16 13:59:20.397489 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.397452 2569 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-138-227.ec2.internal\": node \"ip-10-0-138-227.ec2.internal\" not found"
Apr 16 13:59:20.443578 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.443533 2569 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal"]
Apr 16 13:59:20.443658 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.443614 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:20.444370 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.444354 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:20.444437 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.444379 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:20.444437 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.444389 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:20.446663 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.446652 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:20.446811 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.446795 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal"
Apr 16 13:59:20.446850 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.446826 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:20.447275 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.447251 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:20.447275 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.447256 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:20.447364 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.447291 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:20.447364 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.447322 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:20.447364 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.447292 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:20.447454 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.447371 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:20.449995 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.449982 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal"
Apr 16 13:59:20.450036 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.450008 2569 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 16 13:59:20.450634 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.450617 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientMemory"
Apr 16 13:59:20.450694 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.450647 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasNoDiskPressure"
Apr 16 13:59:20.450694 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.450659 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeHasSufficientPID"
Apr 16 13:59:20.455696 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.455681 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found"
Apr 16 13:59:20.473781 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.473761 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-227.ec2.internal\" not found" node="ip-10-0-138-227.ec2.internal"
Apr 16 13:59:20.477000 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.476985 2569 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-138-227.ec2.internal\" not found" node="ip-10-0-138-227.ec2.internal"
Apr 16 13:59:20.518925 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.518898 2569 reconciler_common.go:251]
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/923ac290859e7e5580d224aa00df9060-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal\" (UID: \"923ac290859e7e5580d224aa00df9060\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.519026 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.518928 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/923ac290859e7e5580d224aa00df9060-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal\" (UID: \"923ac290859e7e5580d224aa00df9060\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.519026 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.518946 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed5d0e13f67e25091a3eceaddbaeb6c3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-227.ec2.internal\" (UID: \"ed5d0e13f67e25091a3eceaddbaeb6c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.555828 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.555808 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:20.619383 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.619359 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/923ac290859e7e5580d224aa00df9060-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal\" (UID: \"923ac290859e7e5580d224aa00df9060\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.619474 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.619386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/923ac290859e7e5580d224aa00df9060-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal\" (UID: \"923ac290859e7e5580d224aa00df9060\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.619474 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.619405 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed5d0e13f67e25091a3eceaddbaeb6c3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-227.ec2.internal\" (UID: \"ed5d0e13f67e25091a3eceaddbaeb6c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.619576 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.619475 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/923ac290859e7e5580d224aa00df9060-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal\" (UID: \"923ac290859e7e5580d224aa00df9060\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.619576 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.619491 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/923ac290859e7e5580d224aa00df9060-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal\" (UID: \"923ac290859e7e5580d224aa00df9060\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.619576 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.619482 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/ed5d0e13f67e25091a3eceaddbaeb6c3-config\") pod \"kube-apiserver-proxy-ip-10-0-138-227.ec2.internal\" (UID: \"ed5d0e13f67e25091a3eceaddbaeb6c3\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.656437 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.656377 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:20.757121 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.757080 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:20.777280 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.777246 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.778976 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:20.778959 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal" Apr 16 13:59:20.857771 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.857724 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:20.958330 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:20.958248 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:21.058840 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:21.058811 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:21.131330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.131306 2569 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 16 13:59:21.131840 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.131455 2569 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 16 13:59:21.159610 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:21.159590 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:21.214763 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.214709 2569 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 16 13:59:21.232879 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.232858 2569 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 16 13:59:21.245294 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.245251 2569 
certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-15 13:54:20 +0000 UTC" deadline="2027-10-19 13:44:55.27460127 +0000 UTC" Apr 16 13:59:21.245294 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.245291 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="13223h45m34.029314249s" Apr 16 13:59:21.260407 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:21.260385 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:21.308697 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.308674 2569 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-jrsjx" Apr 16 13:59:21.317135 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.317116 2569 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-jrsjx" Apr 16 13:59:21.331767 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:21.331733 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod923ac290859e7e5580d224aa00df9060.slice/crio-b5fa519a6899c831d04f4c585b6d196d46b1c83516ab0f712c34e333baf6e229 WatchSource:0}: Error finding container b5fa519a6899c831d04f4c585b6d196d46b1c83516ab0f712c34e333baf6e229: Status 404 returned error can't find the container with id b5fa519a6899c831d04f4c585b6d196d46b1c83516ab0f712c34e333baf6e229 Apr 16 13:59:21.335544 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.335526 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 13:59:21.345131 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.345093 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" event={"ID":"923ac290859e7e5580d224aa00df9060","Type":"ContainerStarted","Data":"b5fa519a6899c831d04f4c585b6d196d46b1c83516ab0f712c34e333baf6e229"} Apr 16 13:59:21.361278 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:21.361238 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:21.379653 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:21.379632 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded5d0e13f67e25091a3eceaddbaeb6c3.slice/crio-f21d7dc4104d4b73800ff8ee5e13d34da777d0f7d5ec469f65640a29f53df946 WatchSource:0}: Error finding container f21d7dc4104d4b73800ff8ee5e13d34da777d0f7d5ec469f65640a29f53df946: Status 404 returned error can't find the container with id f21d7dc4104d4b73800ff8ee5e13d34da777d0f7d5ec469f65640a29f53df946 Apr 16 13:59:21.461919 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:21.461884 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:21.517432 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.517406 2569 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:21.562913 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:21.562889 2569 kubelet_node_status.go:515] "Error getting the current node from lister" err="node 
\"ip-10-0-138-227.ec2.internal\" not found" Apr 16 13:59:21.566576 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.566559 2569 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:21.566746 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.566733 2569 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:21.615965 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.615947 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" Apr 16 13:59:21.650761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.650742 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:59:21.651877 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.651864 2569 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal" Apr 16 13:59:21.673451 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:21.673432 2569 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 16 13:59:22.197623 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.197603 2569 apiserver.go:52] "Watching apiserver" Apr 16 13:59:22.209446 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.209425 2569 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 16 13:59:22.210472 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.210452 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-7pb8h","openshift-network-diagnostics/network-check-target-6spc4","openshift-ovn-kubernetes/ovnkube-node-xf96q","kube-system/global-pull-secret-syncer-fmbxr","kube-system/konnectivity-agent-xc2dz","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr","openshift-image-registry/node-ca-7qrkr","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal","openshift-multus/multus-additional-cni-plugins-2j4nh","openshift-multus/multus-lp46m","openshift-network-operator/iptables-alerter-2fd9m","kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal","openshift-cluster-node-tuning-operator/tuned-pmlcl","openshift-dns/node-resolver-qt8cq"] Apr 16 13:59:22.215073 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.215058 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.215154 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.215142 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.217326 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.217307 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.220083 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.220025 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:22.220194 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.220029 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:22.220338 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.220238 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 16 13:59:22.220411 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.220368 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.220411 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.220393 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-27xcp\"" Apr 16 13:59:22.220520 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.220460 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:22.220994 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.220876 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-d49wl\"" Apr 16 13:59:22.220994 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.220910 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 16 13:59:22.220994 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.220886 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 16 13:59:22.221761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.221544 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 16 13:59:22.221761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.221582 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 16 13:59:22.221761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.221591 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 16 13:59:22.221761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.221606 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 16 13:59:22.221761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.221624 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 16 13:59:22.221761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.221663 2569 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 16 13:59:22.221761 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.221612 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-6q6qc\"" Apr 16 13:59:22.222640 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.222522 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:22.224803 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.224785 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 16 13:59:22.224899 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.224850 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-hbb7x\"" Apr 16 13:59:22.224899 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.224866 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.224999 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.224978 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 16 13:59:22.227226 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227211 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.227335 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227217 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-jt6pm\"" Apr 16 13:59:22.227335 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227293 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 16 13:59:22.227442 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227217 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 16 13:59:22.227493 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227217 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 16 13:59:22.227626 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x6xm8\" (UniqueName: \"kubernetes.io/projected/bafb680b-a4fd-4b7b-931c-6d0198bbe401-kube-api-access-x6xm8\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.227685 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227639 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-run-netns\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227685 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227664 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227773 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227691 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-cni-bin\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227773 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227715 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-cni-netd\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227850 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-systemd\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227850 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227803 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-var-lib-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227850 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227827 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-etc-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227992 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227851 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-ovn\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227992 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227869 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovnkube-config\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.227992 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227911 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-kubernetes\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.227992 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.227957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysctl-conf\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228108 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228001 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-kubelet\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228108 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228065 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-sys\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228108 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228091 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-lib-modules\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228219 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228121 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-var-lib-kubelet\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228219 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228155 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thp76\" (UniqueName: \"kubernetes.io/projected/e4a16888-f41d-4786-b04f-bef76cad8d9a-kube-api-access-thp76\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228219 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228177 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228219 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228206 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228379 ip-10-0-138-227 kubenswrapper[2569]: I0416 
13:59:22.228228 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bafb680b-a4fd-4b7b-931c-6d0198bbe401-host\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.228379 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228256 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-systemd\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228379 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228290 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-host\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228379 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228305 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-systemd-units\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228379 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-slash\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228379 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228333 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-env-overrides\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228379 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228356 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovn-node-metrics-cert\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228615 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228420 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bafb680b-a4fd-4b7b-931c-6d0198bbe401-serviceca\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.228615 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228471 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-run\") pod \"tuned-pmlcl\" (UID: 
\"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228615 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228502 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4a16888-f41d-4786-b04f-bef76cad8d9a-tmp\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228615 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228525 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qc5bw\" (UniqueName: \"kubernetes.io/projected/c58f9919-25e7-4c88-ac45-816be0bc0b3a-kube-api-access-qc5bw\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228615 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-tuned\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228615 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228596 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-log-socket\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228852 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228623 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovnkube-script-lib\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.228852 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-modprobe-d\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228852 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysconfig\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228852 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228714 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysctl-d\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.228852 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.228749 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-node-log\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.229590 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.229575 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:22.229731 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.229707 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 16 13:59:22.229824 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.229761 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 16 13:59:22.229824 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.229744 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:22.229824 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.229800 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-g24vs\"" Apr 16 13:59:22.229961 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.229802 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 16 13:59:22.230081 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.230066 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 16 13:59:22.230152 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.230073 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 16 13:59:22.231615 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.231599 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:22.231692 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.231651 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:22.233794 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.233781 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.236127 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.236105 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.236610 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.236595 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-7c6ww\"" Apr 16 13:59:22.236695 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.236679 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 16 13:59:22.238198 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.238182 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.238970 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.238953 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 16 13:59:22.239061 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.239004 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-9knzn\"" Apr 16 13:59:22.239061 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.238956 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 16 13:59:22.239226 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.239212 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 16 13:59:22.241109 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.241090 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-g9hw5\"" Apr 16 13:59:22.241197 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.241152 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 16 13:59:22.241352 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.241338 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 16 13:59:22.317999 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.317964 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:21 +0000 UTC" deadline="2027-11-16 05:04:51.625995509 +0000 UTC" Apr 16 13:59:22.317999 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.317991 2569 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 16 13:59:22.318261 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.317990 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13887h5m29.308008693s" Apr 16 13:59:22.329826 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.329801 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-k8s-cni-cncf-io\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.329936 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.329839 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" 
(UniqueName: \"kubernetes.io/configmap/0075fe44-19cb-4f01-845d-1e50708704ff-multus-daemon-config\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.329936 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.329866 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2pfjr\" (UniqueName: \"kubernetes.io/projected/fd67681a-243f-4812-b07c-94c1aff03647-kube-api-access-2pfjr\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.329936 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.329899 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-systemd\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.329936 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.329928 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-var-lib-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.329954 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-ovn\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.329975 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-systemd\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.329981 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-etc-selinux\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330026 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-kubernetes\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330034 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-var-lib-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330123 ip-10-0-138-227 
kubenswrapper[2569]: I0416 13:59:22.330050 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysctl-conf\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330064 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-ovn\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330074 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-kubelet\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330094 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-kubernetes\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330105 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-kubelet\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330122 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-kubelet\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330148 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/19600904-44a6-4d4d-aaf8-38af8d1e94b3-hosts-file\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330177 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330208 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysctl-conf\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " 
pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330204 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330250 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-run-ovn-kubernetes\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330282 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-cni-multus\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330315 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-hostroot\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330358 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4txgx\" (UniqueName: \"kubernetes.io/projected/19600904-44a6-4d4d-aaf8-38af8d1e94b3-kube-api-access-4txgx\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330393 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-cni-bin\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330472 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-systemd\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330504 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-host\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330535 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: 
\"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-systemd\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330572 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/199e9928-49c7-45cd-93cb-793f77b87ea8-host-slash\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.330616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-host\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330644 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/865317bf-68d0-4437-94d2-7e1b8f99dbb1-konnectivity-ca\") pod \"konnectivity-agent-xc2dz\" (UID: \"865317bf-68d0-4437-94d2-7e1b8f99dbb1\") " pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330670 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-device-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330710 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330745 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4a16888-f41d-4786-b04f-bef76cad8d9a-tmp\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330809 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qc5bw\" (UniqueName: \"kubernetes.io/projected/c58f9919-25e7-4c88-ac45-816be0bc0b3a-kube-api-access-qc5bw\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09070b7d-cdb7-4268-8a8b-096e6b5cff88-dbus\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 
13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330865 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-tuned\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330931 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-log-socket\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.330983 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovnkube-script-lib\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331014 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-etc-kubernetes\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331030 2569 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331048 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-modprobe-d\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331064 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-log-socket\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysconfig\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331160 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysconfig\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.331184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331177 2569 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-cni-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-modprobe-d\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331221 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-os-release\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331251 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/199e9928-49c7-45cd-93cb-793f77b87ea8-iptables-alerter-script\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331293 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331318 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19600904-44a6-4d4d-aaf8-38af8d1e94b3-tmp-dir\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331350 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x6xm8\" (UniqueName: \"kubernetes.io/projected/bafb680b-a4fd-4b7b-931c-6d0198bbe401-kube-api-access-x6xm8\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331374 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0075fe44-19cb-4f01-845d-1e50708704ff-cni-binary-copy\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331399 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-multus-certs\") pod \"multus-lp46m\" (UID: 
\"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331426 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-etc-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331454 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovnkube-config\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331480 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-sys-fs\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331503 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-system-cni-dir\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331526 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-cnibin\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331552 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331572 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-etc-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-sys\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.331929 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331609 2569 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-lib-modules\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331729 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-lib-modules\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-sys\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331895 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-var-lib-kubelet\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thp76\" (UniqueName: \"kubernetes.io/projected/e4a16888-f41d-4786-b04f-bef76cad8d9a-kube-api-access-thp76\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331962 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331991 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-socket-dir-parent\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.331998 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-var-lib-kubelet\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332017 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-os-release\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " 
pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332045 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bafb680b-a4fd-4b7b-931c-6d0198bbe401-host\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332047 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332070 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-systemd-units\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332100 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-slash\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332134 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-env-overrides\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332143 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-systemd-units\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332145 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovnkube-script-lib\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332184 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/bafb680b-a4fd-4b7b-931c-6d0198bbe401-host\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.332839 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332161 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovn-node-metrics-cert\") pod \"ovnkube-node-xf96q\" 
(UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovnkube-config\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332247 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-system-cni-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332311 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bafb680b-a4fd-4b7b-931c-6d0198bbe401-serviceca\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332314 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-slash\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-run\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332399 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-run\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332435 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09070b7d-cdb7-4268-8a8b-096e6b5cff88-kubelet-config\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332461 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: 
\"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-cnibin\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-conf-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332521 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-52rst\" (UniqueName: \"kubernetes.io/projected/0075fe44-19cb-4f01-845d-1e50708704ff-kube-api-access-52rst\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332585 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332607 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/c58f9919-25e7-4c88-ac45-816be0bc0b3a-env-overrides\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332614 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332660 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-socket-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332688 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/bafb680b-a4fd-4b7b-931c-6d0198bbe401-serviceca\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.333520 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332722 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " 
pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332762 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysctl-d\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332794 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-node-log\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332822 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f4cxg\" (UniqueName: \"kubernetes.io/projected/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-kube-api-access-f4cxg\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332847 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-netns\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332858 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-sysctl-d\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332892 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-node-log\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332930 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-swzp9\" (UniqueName: \"kubernetes.io/projected/199e9928-49c7-45cd-93cb-793f77b87ea8-kube-api-access-swzp9\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332957 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/865317bf-68d0-4437-94d2-7e1b8f99dbb1-agent-certs\") pod \"konnectivity-agent-xc2dz\" (UID: \"865317bf-68d0-4437-94d2-7e1b8f99dbb1\") " pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.332983 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: 
\"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-registration-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.333010 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fsvv6\" (UniqueName: \"kubernetes.io/projected/5445e4c9-57ac-4c32-964e-2165416593b6-kube-api-access-fsvv6\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.333037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-run-netns\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.333060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.333087 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-cni-bin\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.333112 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-cni-netd\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.333116 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-run-netns\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.333118 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-run-openvswitch\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.334289 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.333156 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-cni-bin\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.335093 ip-10-0-138-227 kubenswrapper[2569]: 
I0416 13:59:22.333207 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/c58f9919-25e7-4c88-ac45-816be0bc0b3a-host-cni-netd\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.335093 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.334137 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/e4a16888-f41d-4786-b04f-bef76cad8d9a-etc-tuned\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.335093 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.334176 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/e4a16888-f41d-4786-b04f-bef76cad8d9a-tmp\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.335093 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.334968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/c58f9919-25e7-4c88-ac45-816be0bc0b3a-ovn-node-metrics-cert\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.336812 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.336794 2569 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 16 13:59:22.339842 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.339816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thp76\" (UniqueName: \"kubernetes.io/projected/e4a16888-f41d-4786-b04f-bef76cad8d9a-kube-api-access-thp76\") pod \"tuned-pmlcl\" (UID: \"e4a16888-f41d-4786-b04f-bef76cad8d9a\") " pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.340063 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.340043 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x6xm8\" (UniqueName: \"kubernetes.io/projected/bafb680b-a4fd-4b7b-931c-6d0198bbe401-kube-api-access-x6xm8\") pod \"node-ca-7qrkr\" (UID: \"bafb680b-a4fd-4b7b-931c-6d0198bbe401\") " pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.340883 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.340863 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qc5bw\" (UniqueName: \"kubernetes.io/projected/c58f9919-25e7-4c88-ac45-816be0bc0b3a-kube-api-access-qc5bw\") pod \"ovnkube-node-xf96q\" (UID: \"c58f9919-25e7-4c88-ac45-816be0bc0b3a\") " pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.347508 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.347482 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal" event={"ID":"ed5d0e13f67e25091a3eceaddbaeb6c3","Type":"ContainerStarted","Data":"f21d7dc4104d4b73800ff8ee5e13d34da777d0f7d5ec469f65640a29f53df946"} Apr 16 13:59:22.433553 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433522 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: 
\"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-socket-dir-parent\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.433553 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433555 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-os-release\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433582 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433608 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-system-cni-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433633 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09070b7d-cdb7-4268-8a8b-096e6b5cff88-kubelet-config\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-socket-dir-parent\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433657 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-cnibin\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433670 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-os-release\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433721 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/09070b7d-cdb7-4268-8a8b-096e6b5cff88-kubelet-config\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433725 2569 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-system-cni-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433763 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-conf-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.433778 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.433778 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433839 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-cnibin\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433783 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-52rst\" (UniqueName: \"kubernetes.io/projected/0075fe44-19cb-4f01-845d-1e50708704ff-kube-api-access-52rst\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433890 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-conf-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.433907 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs podName:1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:22.933871321 +0000 UTC m=+3.127687251 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs") pod "network-metrics-daemon-7pb8h" (UID: "1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433968 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.433997 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-socket-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434029 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434056 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f4cxg\" (UniqueName: \"kubernetes.io/projected/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-kube-api-access-f4cxg\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434089 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-netns\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434114 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-swzp9\" (UniqueName: \"kubernetes.io/projected/199e9928-49c7-45cd-93cb-793f77b87ea8-kube-api-access-swzp9\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434145 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/865317bf-68d0-4437-94d2-7e1b8f99dbb1-agent-certs\") pod 
\"konnectivity-agent-xc2dz\" (UID: \"865317bf-68d0-4437-94d2-7e1b8f99dbb1\") " pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434167 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-socket-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434170 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-registration-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434203 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-netns\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.434387 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434210 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fsvv6\" (UniqueName: \"kubernetes.io/projected/5445e4c9-57ac-4c32-964e-2165416593b6-kube-api-access-fsvv6\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434242 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-k8s-cni-cncf-io\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434289 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0075fe44-19cb-4f01-845d-1e50708704ff-multus-daemon-config\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2pfjr\" (UniqueName: \"kubernetes.io/projected/fd67681a-243f-4812-b07c-94c1aff03647-kube-api-access-2pfjr\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434345 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-etc-selinux\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434361 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-registration-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434371 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-kubelet\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434396 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/19600904-44a6-4d4d-aaf8-38af8d1e94b3-hosts-file\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434422 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-cni-multus\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434475 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-hostroot\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4txgx\" (UniqueName: \"kubernetes.io/projected/19600904-44a6-4d4d-aaf8-38af8d1e94b3-kube-api-access-4txgx\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434549 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-k8s-cni-cncf-io\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434564 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-cni-bin\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435103 ip-10-0-138-227 
kubenswrapper[2569]: I0416 13:59:22.434594 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/199e9928-49c7-45cd-93cb-793f77b87ea8-host-slash\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/865317bf-68d0-4437-94d2-7e1b8f99dbb1-konnectivity-ca\") pod \"konnectivity-agent-xc2dz\" (UID: \"865317bf-68d0-4437-94d2-7e1b8f99dbb1\") " pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.434635 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434638 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-cni-binary-copy\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.435103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434644 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-device-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.434692 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret podName:09070b7d-cdb7-4268-8a8b-096e6b5cff88 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:22.934676771 +0000 UTC m=+3.128492684 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret") pod "global-pull-secret-syncer-fmbxr" (UID: "09070b7d-cdb7-4268-8a8b-096e6b5cff88") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434714 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434719 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/199e9928-49c7-45cd-93cb-793f77b87ea8-host-slash\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434749 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09070b7d-cdb7-4268-8a8b-096e6b5cff88-dbus\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434767 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-cni-bin\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434780 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-etc-kubernetes\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434807 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-cni-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-cni-multus\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434690 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-device-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.435987 ip-10-0-138-227 
kubenswrapper[2569]: I0416 13:59:22.434826 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-hostroot\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434830 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-os-release\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434865 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/199e9928-49c7-45cd-93cb-793f77b87ea8-iptables-alerter-script\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434887 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-os-release\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434890 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434918 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19600904-44a6-4d4d-aaf8-38af8d1e94b3-tmp-dir\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434932 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-etc-selinux\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.435987 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0075fe44-19cb-4f01-845d-1e50708704ff-cni-binary-copy\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434971 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-multus-certs\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.436980 ip-10-0-138-227 
kubenswrapper[2569]: I0416 13:59:22.434975 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-var-lib-kubelet\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-sys-fs\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.434999 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-tuning-conf-dir\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435020 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-system-cni-dir\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435056 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-etc-kubernetes\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435058 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-system-cni-dir\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435075 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/19600904-44a6-4d4d-aaf8-38af8d1e94b3-hosts-file\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435119 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-kubelet-dir\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435177 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/09070b7d-cdb7-4268-8a8b-096e6b5cff88-dbus\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " 
pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435192 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/865317bf-68d0-4437-94d2-7e1b8f99dbb1-konnectivity-ca\") pod \"konnectivity-agent-xc2dz\" (UID: \"865317bf-68d0-4437-94d2-7e1b8f99dbb1\") " pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435209 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-cnibin\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435237 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435249 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-host-run-multus-certs\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435335 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5445e4c9-57ac-4c32-964e-2165416593b6-cnibin\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435379 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0075fe44-19cb-4f01-845d-1e50708704ff-multus-cni-dir\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.436980 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435382 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.437595 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435389 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/19600904-44a6-4d4d-aaf8-38af8d1e94b3-tmp-dir\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.437595 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435425 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/fd67681a-243f-4812-b07c-94c1aff03647-sys-fs\") pod 
\"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.437595 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435594 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/199e9928-49c7-45cd-93cb-793f77b87ea8-iptables-alerter-script\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.437595 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.435676 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5445e4c9-57ac-4c32-964e-2165416593b6-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.437595 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.436189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0075fe44-19cb-4f01-845d-1e50708704ff-cni-binary-copy\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.437595 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.436189 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/0075fe44-19cb-4f01-845d-1e50708704ff-multus-daemon-config\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.437595 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.437327 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/865317bf-68d0-4437-94d2-7e1b8f99dbb1-agent-certs\") pod \"konnectivity-agent-xc2dz\" (UID: \"865317bf-68d0-4437-94d2-7e1b8f99dbb1\") " pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:22.441549 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.441527 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:22.441549 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.441550 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:22.441709 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.441562 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q92h8 for pod openshift-network-diagnostics/network-check-target-6spc4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:22.441709 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.441627 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8 podName:cab10f5b-6eb5-409e-a6a8-a1bf534e28e2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:22.94161087 +0000 UTC m=+3.135426800 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-q92h8" (UniqueName: "kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8") pod "network-check-target-6spc4" (UID: "cab10f5b-6eb5-409e-a6a8-a1bf534e28e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:22.444103 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.443809 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-52rst\" (UniqueName: \"kubernetes.io/projected/0075fe44-19cb-4f01-845d-1e50708704ff-kube-api-access-52rst\") pod \"multus-lp46m\" (UID: \"0075fe44-19cb-4f01-845d-1e50708704ff\") " pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.445791 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.445765 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-swzp9\" (UniqueName: \"kubernetes.io/projected/199e9928-49c7-45cd-93cb-793f77b87ea8-kube-api-access-swzp9\") pod \"iptables-alerter-2fd9m\" (UID: \"199e9928-49c7-45cd-93cb-793f77b87ea8\") " pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.446016 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.445994 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2pfjr\" (UniqueName: \"kubernetes.io/projected/fd67681a-243f-4812-b07c-94c1aff03647-kube-api-access-2pfjr\") pod \"aws-ebs-csi-driver-node-2hzgr\" (UID: \"fd67681a-243f-4812-b07c-94c1aff03647\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.446927 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.446906 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4txgx\" (UniqueName: \"kubernetes.io/projected/19600904-44a6-4d4d-aaf8-38af8d1e94b3-kube-api-access-4txgx\") pod \"node-resolver-qt8cq\" (UID: \"19600904-44a6-4d4d-aaf8-38af8d1e94b3\") " pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.447883 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.447826 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f4cxg\" (UniqueName: \"kubernetes.io/projected/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-kube-api-access-f4cxg\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:22.448354 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.448333 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fsvv6\" (UniqueName: \"kubernetes.io/projected/5445e4c9-57ac-4c32-964e-2165416593b6-kube-api-access-fsvv6\") pod \"multus-additional-cni-plugins-2j4nh\" (UID: \"5445e4c9-57ac-4c32-964e-2165416593b6\") " pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.525538 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.525438 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7qrkr" Apr 16 13:59:22.531946 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.531920 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" Apr 16 13:59:22.537576 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.537546 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbafb680b_a4fd_4b7b_931c_6d0198bbe401.slice/crio-70aa9588401b32a51e7cbc7892beb51d7698f0cc0d748e901a4efd82434f35ed WatchSource:0}: Error finding container 70aa9588401b32a51e7cbc7892beb51d7698f0cc0d748e901a4efd82434f35ed: Status 404 returned error can't find the container with id 70aa9588401b32a51e7cbc7892beb51d7698f0cc0d748e901a4efd82434f35ed Apr 16 13:59:22.538814 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.538790 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:22.541589 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.541424 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode4a16888_f41d_4786_b04f_bef76cad8d9a.slice/crio-ea563e9c9f5be476e6a32d59dc95cf866c78d21340963060b5e6105811e412cf WatchSource:0}: Error finding container ea563e9c9f5be476e6a32d59dc95cf866c78d21340963060b5e6105811e412cf: Status 404 returned error can't find the container with id ea563e9c9f5be476e6a32d59dc95cf866c78d21340963060b5e6105811e412cf Apr 16 13:59:22.544004 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.543983 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:22.548078 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.548019 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc58f9919_25e7_4c88_ac45_816be0bc0b3a.slice/crio-cf0fb7b5c8a768d4e856149000ae489b5b5504d12c62a53bd9e1d512246cb022 WatchSource:0}: Error finding container cf0fb7b5c8a768d4e856149000ae489b5b5504d12c62a53bd9e1d512246cb022: Status 404 returned error can't find the container with id cf0fb7b5c8a768d4e856149000ae489b5b5504d12c62a53bd9e1d512246cb022 Apr 16 13:59:22.548585 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.548562 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" Apr 16 13:59:22.556756 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.556731 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" Apr 16 13:59:22.557223 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.557199 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod865317bf_68d0_4437_94d2_7e1b8f99dbb1.slice/crio-c4f014eb835394f295f5fb67be231f1256162ab25c5a154af80657d231ead3df WatchSource:0}: Error finding container c4f014eb835394f295f5fb67be231f1256162ab25c5a154af80657d231ead3df: Status 404 returned error can't find the container with id c4f014eb835394f295f5fb67be231f1256162ab25c5a154af80657d231ead3df Apr 16 13:59:22.562619 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.562597 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-lp46m" Apr 16 13:59:22.562858 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.562822 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfd67681a_243f_4812_b07c_94c1aff03647.slice/crio-e9c4339ffa9dc5048acf260af88e4d7df1872ca510999fa95ce9026a8570b95e WatchSource:0}: Error finding container e9c4339ffa9dc5048acf260af88e4d7df1872ca510999fa95ce9026a8570b95e: Status 404 returned error can't find the container with id e9c4339ffa9dc5048acf260af88e4d7df1872ca510999fa95ce9026a8570b95e Apr 16 13:59:22.569658 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.569638 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-2fd9m" Apr 16 13:59:22.570260 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.570239 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5445e4c9_57ac_4c32_964e_2165416593b6.slice/crio-8609311edcb2a857b8f81f500cf2b23ab16abb2896aa12c1a9ed870370363766 WatchSource:0}: Error finding container 8609311edcb2a857b8f81f500cf2b23ab16abb2896aa12c1a9ed870370363766: Status 404 returned error can't find the container with id 8609311edcb2a857b8f81f500cf2b23ab16abb2896aa12c1a9ed870370363766 Apr 16 13:59:22.573628 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.573607 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0075fe44_19cb_4f01_845d_1e50708704ff.slice/crio-f9aad4cd948837d70aaab196cc4a65c8dc34f2116fb1b45c5f12f118eea64385 WatchSource:0}: Error finding container f9aad4cd948837d70aaab196cc4a65c8dc34f2116fb1b45c5f12f118eea64385: Status 404 returned error can't find the container with id f9aad4cd948837d70aaab196cc4a65c8dc34f2116fb1b45c5f12f118eea64385 Apr 16 13:59:22.574907 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.574856 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-qt8cq" Apr 16 13:59:22.582446 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.582423 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod199e9928_49c7_45cd_93cb_793f77b87ea8.slice/crio-dd204db90bf8d0d0afec41ecbea9498d993f087bde10061e1ab74afafda59916 WatchSource:0}: Error finding container dd204db90bf8d0d0afec41ecbea9498d993f087bde10061e1ab74afafda59916: Status 404 returned error can't find the container with id dd204db90bf8d0d0afec41ecbea9498d993f087bde10061e1ab74afafda59916 Apr 16 13:59:22.586759 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:22.586732 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod19600904_44a6_4d4d_aaf8_38af8d1e94b3.slice/crio-905c55e85c29163cbf9a693cc678914185bd592416b4b1227757663f56eb5dea WatchSource:0}: Error finding container 905c55e85c29163cbf9a693cc678914185bd592416b4b1227757663f56eb5dea: Status 404 returned error can't find the container with id 905c55e85c29163cbf9a693cc678914185bd592416b4b1227757663f56eb5dea Apr 16 13:59:22.938538 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.938450 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:22.938538 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:22.938512 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:22.938749 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.938603 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:22.938749 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.938643 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:22.938749 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.938680 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret podName:09070b7d-cdb7-4268-8a8b-096e6b5cff88 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:23.938660984 +0000 UTC m=+4.132476913 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret") pod "global-pull-secret-syncer-fmbxr" (UID: "09070b7d-cdb7-4268-8a8b-096e6b5cff88") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:22.938749 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:22.938700 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs podName:1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:23.938687813 +0000 UTC m=+4.132503725 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs") pod "network-metrics-daemon-7pb8h" (UID: "1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:23.039481 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.039441 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:23.039656 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.039632 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:23.039737 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.039668 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:23.039737 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.039682 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q92h8 for pod openshift-network-diagnostics/network-check-target-6spc4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:23.039837 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.039756 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8 podName:cab10f5b-6eb5-409e-a6a8-a1bf534e28e2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:24.039736564 +0000 UTC m=+4.233552491 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-q92h8" (UniqueName: "kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8") pod "network-check-target-6spc4" (UID: "cab10f5b-6eb5-409e-a6a8-a1bf534e28e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:23.319239 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.319149 2569 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-15 13:54:21 +0000 UTC" deadline="2027-10-14 21:56:33.952909295 +0000 UTC" Apr 16 13:59:23.319239 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.319185 2569 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13111h57m10.633727964s" Apr 16 13:59:23.343641 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.343616 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:23.343775 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.343616 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:23.346113 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.344060 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:23.346113 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.344652 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:23.349401 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.349377 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2fd9m" event={"ID":"199e9928-49c7-45cd-93cb-793f77b87ea8","Type":"ContainerStarted","Data":"dd204db90bf8d0d0afec41ecbea9498d993f087bde10061e1ab74afafda59916"} Apr 16 13:59:23.350163 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.350140 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" event={"ID":"fd67681a-243f-4812-b07c-94c1aff03647","Type":"ContainerStarted","Data":"e9c4339ffa9dc5048acf260af88e4d7df1872ca510999fa95ce9026a8570b95e"} Apr 16 13:59:23.350874 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.350856 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xc2dz" event={"ID":"865317bf-68d0-4437-94d2-7e1b8f99dbb1","Type":"ContainerStarted","Data":"c4f014eb835394f295f5fb67be231f1256162ab25c5a154af80657d231ead3df"} Apr 16 13:59:23.351576 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.351560 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7qrkr" event={"ID":"bafb680b-a4fd-4b7b-931c-6d0198bbe401","Type":"ContainerStarted","Data":"70aa9588401b32a51e7cbc7892beb51d7698f0cc0d748e901a4efd82434f35ed"} Apr 16 13:59:23.352376 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.352355 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qt8cq" event={"ID":"19600904-44a6-4d4d-aaf8-38af8d1e94b3","Type":"ContainerStarted","Data":"905c55e85c29163cbf9a693cc678914185bd592416b4b1227757663f56eb5dea"} Apr 16 13:59:23.353314 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.353287 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lp46m" event={"ID":"0075fe44-19cb-4f01-845d-1e50708704ff","Type":"ContainerStarted","Data":"f9aad4cd948837d70aaab196cc4a65c8dc34f2116fb1b45c5f12f118eea64385"} Apr 16 13:59:23.354057 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.354040 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" event={"ID":"5445e4c9-57ac-4c32-964e-2165416593b6","Type":"ContainerStarted","Data":"8609311edcb2a857b8f81f500cf2b23ab16abb2896aa12c1a9ed870370363766"} Apr 16 13:59:23.354886 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.354853 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"cf0fb7b5c8a768d4e856149000ae489b5b5504d12c62a53bd9e1d512246cb022"} Apr 16 13:59:23.355668 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.355648 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" event={"ID":"e4a16888-f41d-4786-b04f-bef76cad8d9a","Type":"ContainerStarted","Data":"ea563e9c9f5be476e6a32d59dc95cf866c78d21340963060b5e6105811e412cf"} Apr 16 13:59:23.946660 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.946547 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:23.946819 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:23.946671 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:23.946887 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.946846 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:23.946943 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.946910 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret podName:09070b7d-cdb7-4268-8a8b-096e6b5cff88 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.946891767 +0000 UTC m=+6.140707691 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret") pod "global-pull-secret-syncer-fmbxr" (UID: "09070b7d-cdb7-4268-8a8b-096e6b5cff88") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:23.947225 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.947206 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:23.947322 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:23.947285 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs podName:1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:25.947252954 +0000 UTC m=+6.141068871 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs") pod "network-metrics-daemon-7pb8h" (UID: "1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:24.047141 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:24.047038 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:24.047327 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:24.047233 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:24.047327 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:24.047257 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:24.047327 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:24.047289 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q92h8 for pod openshift-network-diagnostics/network-check-target-6spc4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:24.047481 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:24.047355 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8 podName:cab10f5b-6eb5-409e-a6a8-a1bf534e28e2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:26.047336483 +0000 UTC m=+6.241152425 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-q92h8" (UniqueName: "kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8") pod "network-check-target-6spc4" (UID: "cab10f5b-6eb5-409e-a6a8-a1bf534e28e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:24.345000 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:24.344328 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:24.345000 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:24.344464 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:24.365779 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:24.365743 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal" event={"ID":"ed5d0e13f67e25091a3eceaddbaeb6c3","Type":"ContainerStarted","Data":"1f8eac6e63023e49377a2795e9e30c3d25063cc2d3071e8a1065a6f14788aac7"} Apr 16 13:59:24.382945 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:24.382890 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-138-227.ec2.internal" podStartSLOduration=3.382870561 podStartE2EDuration="3.382870561s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:24.382408951 +0000 UTC m=+4.576224887" watchObservedRunningTime="2026-04-16 13:59:24.382870561 +0000 UTC m=+4.576686492" Apr 16 13:59:25.342904 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:25.342766 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:25.343048 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:25.342903 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:25.343325 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:25.343263 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:25.343400 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:25.343372 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:25.380951 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:25.380678 2569 generic.go:358] "Generic (PLEG): container finished" podID="923ac290859e7e5580d224aa00df9060" containerID="924a15e4666a9189cc512920d95113b287fb7fbde5ed2a0b87b64dc6554559ef" exitCode=0 Apr 16 13:59:25.380951 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:25.380767 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" event={"ID":"923ac290859e7e5580d224aa00df9060","Type":"ContainerDied","Data":"924a15e4666a9189cc512920d95113b287fb7fbde5ed2a0b87b64dc6554559ef"} Apr 16 13:59:25.961791 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:25.961754 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:25.961961 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:25.961831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:25.962035 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:25.961969 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:25.962035 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:25.962026 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs podName:1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.962008764 +0000 UTC m=+10.155824683 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs") pod "network-metrics-daemon-7pb8h" (UID: "1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:25.962150 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:25.962099 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:25.962150 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:25.962135 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret podName:09070b7d-cdb7-4268-8a8b-096e6b5cff88 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:29.962124341 +0000 UTC m=+10.155940260 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret") pod "global-pull-secret-syncer-fmbxr" (UID: "09070b7d-cdb7-4268-8a8b-096e6b5cff88") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:26.062210 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:26.062176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:26.062392 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:26.062353 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:26.062392 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:26.062372 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:26.062392 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:26.062383 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q92h8 for pod openshift-network-diagnostics/network-check-target-6spc4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:26.062545 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:26.062440 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8 podName:cab10f5b-6eb5-409e-a6a8-a1bf534e28e2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:30.062418915 +0000 UTC m=+10.256234847 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-q92h8" (UniqueName: "kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8") pod "network-check-target-6spc4" (UID: "cab10f5b-6eb5-409e-a6a8-a1bf534e28e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:26.349213 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:26.348465 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:26.349213 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:26.348650 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:27.342965 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:27.342934 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:27.343413 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:27.342934 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:27.343413 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:27.343106 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:27.343413 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:27.343180 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:28.343959 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:28.343913 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:28.344395 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:28.344042 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:29.344379 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:29.343628 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:29.344379 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:29.343760 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:29.344379 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:29.344148 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:29.344379 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:29.344250 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:29.993435 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:29.993400 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:29.993618 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:29.993590 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:29.993678 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:29.993651 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs podName:1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:37.993632365 +0000 UTC m=+18.187448291 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs") pod "network-metrics-daemon-7pb8h" (UID: "1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:29.993678 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:29.993652 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:29.993678 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:29.993587 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:29.993891 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:29.993697 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret podName:09070b7d-cdb7-4268-8a8b-096e6b5cff88 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:37.993686344 +0000 UTC m=+18.187502275 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret") pod "global-pull-secret-syncer-fmbxr" (UID: "09070b7d-cdb7-4268-8a8b-096e6b5cff88") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:30.094960 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:30.094904 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:30.095143 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:30.095091 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:30.095143 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:30.095112 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:30.095143 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:30.095124 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q92h8 for pod openshift-network-diagnostics/network-check-target-6spc4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:30.095345 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:30.095179 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8 podName:cab10f5b-6eb5-409e-a6a8-a1bf534e28e2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:38.095160582 +0000 UTC m=+18.288976515 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-q92h8" (UniqueName: "kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8") pod "network-check-target-6spc4" (UID: "cab10f5b-6eb5-409e-a6a8-a1bf534e28e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:30.344434 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:30.344357 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:30.344917 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:30.344472 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:31.342860 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:31.342825 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:31.343040 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:31.342835 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:31.343040 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:31.342990 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:31.343147 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:31.343063 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:32.343638 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:32.343551 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:32.344148 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:32.343695 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:33.343526 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:33.343498 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:33.343692 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:33.343497 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:33.343692 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:33.343601 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:33.344166 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:33.343681 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:34.343639 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:34.343605 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:34.343796 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:34.343718 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:35.343151 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:35.343116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:35.343353 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:35.343116 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:35.343353 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:35.343222 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:35.343442 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:35.343356 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:36.343724 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:36.343688 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:36.344200 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:36.343812 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:37.343634 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:37.343599 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:37.343882 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:37.343599 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:37.343882 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:37.343732 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:37.343882 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:37.343846 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:38.052518 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:38.052477 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:38.052709 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:38.052537 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:38.052709 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.052645 2569 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:38.052709 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.052656 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:38.052864 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.052712 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret podName:09070b7d-cdb7-4268-8a8b-096e6b5cff88 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.052694869 +0000 UTC m=+34.246510787 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret") pod "global-pull-secret-syncer-fmbxr" (UID: "09070b7d-cdb7-4268-8a8b-096e6b5cff88") : object "kube-system"/"original-pull-secret" not registered Apr 16 13:59:38.052864 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.052734 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs podName:1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.052722998 +0000 UTC m=+34.246538914 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs") pod "network-metrics-daemon-7pb8h" (UID: "1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 16 13:59:38.153581 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:38.153550 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:38.153749 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.153689 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 16 13:59:38.153749 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.153708 2569 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 16 13:59:38.153749 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.153721 2569 projected.go:194] Error preparing data for projected volume kube-api-access-q92h8 for pod openshift-network-diagnostics/network-check-target-6spc4: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:38.153894 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.153781 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8 podName:cab10f5b-6eb5-409e-a6a8-a1bf534e28e2 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.153764048 +0000 UTC m=+34.347579963 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-q92h8" (UniqueName: "kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8") pod "network-check-target-6spc4" (UID: "cab10f5b-6eb5-409e-a6a8-a1bf534e28e2") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 16 13:59:38.343750 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:38.343680 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:38.343885 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:38.343776 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:39.343063 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:39.343030 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:39.343516 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:39.343030 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:39.343516 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:39.343159 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:39.343516 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:39.343223 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:40.344000 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:40.343973 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:40.344367 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:40.344108 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:40.452059 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:40.451220 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" event={"ID":"923ac290859e7e5580d224aa00df9060","Type":"ContainerStarted","Data":"6b11707d81050164abb324895b37c4e482a8f0f020817aa37f95329baf67a73b"} Apr 16 13:59:40.455616 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:40.455523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" event={"ID":"e4a16888-f41d-4786-b04f-bef76cad8d9a","Type":"ContainerStarted","Data":"5770068ad9d9fde9fcaba16ad42b7846f3fb0380041ab26740acb54c41dd5291"} Apr 16 13:59:40.470620 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:40.470175 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-138-227.ec2.internal" podStartSLOduration=19.470155673 podStartE2EDuration="19.470155673s" podCreationTimestamp="2026-04-16 13:59:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 13:59:40.46901715 +0000 UTC m=+20.662833086" watchObservedRunningTime="2026-04-16 13:59:40.470155673 +0000 UTC m=+20.663971609" Apr 16 13:59:41.343919 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.343721 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:41.344063 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.343732 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:41.344063 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:41.344005 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:41.344690 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:41.344071 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:41.458335 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.458230 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" event={"ID":"fd67681a-243f-4812-b07c-94c1aff03647","Type":"ContainerStarted","Data":"f84ea0f7259f1bf4923ae41e79cd44c2ad2823c430c6b6c6d4d49a918cdb6570"} Apr 16 13:59:41.459785 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.459752 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-xc2dz" event={"ID":"865317bf-68d0-4437-94d2-7e1b8f99dbb1","Type":"ContainerStarted","Data":"ee6afb16c7efa3476acd6abfad20efab2b26eb766f7a316e8937aef169801b93"} Apr 16 13:59:41.461161 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.461088 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7qrkr" event={"ID":"bafb680b-a4fd-4b7b-931c-6d0198bbe401","Type":"ContainerStarted","Data":"06b98d52969669388c8be755f062e60557c1dbf058199508fba5bd3e7fdf45a9"} Apr 16 13:59:41.462395 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.462371 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-qt8cq" event={"ID":"19600904-44a6-4d4d-aaf8-38af8d1e94b3","Type":"ContainerStarted","Data":"a1e38daf4a7e583a5e7fabbeb9eb7a1d6859b4580fd23e72570a94a25ff572f9"} Apr 16 13:59:41.463861 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.463834 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-lp46m" event={"ID":"0075fe44-19cb-4f01-845d-1e50708704ff","Type":"ContainerStarted","Data":"e2987f4c15762cd2b63a1e45875f70a706447704b6f65904e12ea6aa1f9f74db"} Apr 16 13:59:41.465194 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.465169 2569 generic.go:358] "Generic (PLEG): container finished" podID="5445e4c9-57ac-4c32-964e-2165416593b6" containerID="c7f45f79ba3a3b5bbe6801f9d286025b9642a27ef51c43769f0a8c1f7981142b" exitCode=0 Apr 16 13:59:41.465320 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.465239 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" event={"ID":"5445e4c9-57ac-4c32-964e-2165416593b6","Type":"ContainerDied","Data":"c7f45f79ba3a3b5bbe6801f9d286025b9642a27ef51c43769f0a8c1f7981142b"} Apr 16 13:59:41.468521 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.468304 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" 
event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"62730e987712ecab6c86c16ee4f1c14bdd395400beba13ff301eb12f7a5f7e6b"} Apr 16 13:59:41.468521 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.468333 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"4a3d9c966409986a042f8695e209cf9e9a82a97259448edad5a2beae2a077023"} Apr 16 13:59:41.468521 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.468348 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"106d1bdb5e26d95dae1ab3a2c1bf1a7abf5136e1c1f1e183a10f160d6d29b99d"} Apr 16 13:59:41.468521 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.468371 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"dbac2bcee4ccd7e95c47738b3b5fe7bb14ea2fa0d219d7b1fdae57b38cd928f0"} Apr 16 13:59:41.468521 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.468382 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"058ccc4af4853f5f3a45caf6afd1a50ec1cb87db6ec63954aae6e52c75edb476"} Apr 16 13:59:41.468521 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.468394 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"8752c6ed5e9ad4983cd2660adbe8e9d18a6078bf94da8ce4dafbf4208208e933"} Apr 16 13:59:41.477426 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.477382 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-pmlcl" podStartSLOduration=3.9801385 podStartE2EDuration="21.47736772s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.543915314 +0000 UTC m=+2.737731243" lastFinishedPulling="2026-04-16 13:59:40.041144538 +0000 UTC m=+20.234960463" observedRunningTime="2026-04-16 13:59:40.49343841 +0000 UTC m=+20.687254547" watchObservedRunningTime="2026-04-16 13:59:41.47736772 +0000 UTC m=+21.671183653" Apr 16 13:59:41.503085 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.503035 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-xc2dz" podStartSLOduration=4.022200815 podStartE2EDuration="21.503021828s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.560324463 +0000 UTC m=+2.754140377" lastFinishedPulling="2026-04-16 13:59:40.041145462 +0000 UTC m=+20.234961390" observedRunningTime="2026-04-16 13:59:41.47708081 +0000 UTC m=+21.670896745" watchObservedRunningTime="2026-04-16 13:59:41.503021828 +0000 UTC m=+21.696837763" Apr 16 13:59:41.503431 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.503388 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-lp46m" podStartSLOduration=3.847372457 podStartE2EDuration="21.503379361s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.577207953 +0000 UTC m=+2.771023870" lastFinishedPulling="2026-04-16 
13:59:40.233214847 +0000 UTC m=+20.427030774" observedRunningTime="2026-04-16 13:59:41.502901039 +0000 UTC m=+21.696716976" watchObservedRunningTime="2026-04-16 13:59:41.503379361 +0000 UTC m=+21.697195298" Apr 16 13:59:41.543531 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.543484 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-qt8cq" podStartSLOduration=4.091471623 podStartE2EDuration="21.543471238s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.589120842 +0000 UTC m=+2.782936758" lastFinishedPulling="2026-04-16 13:59:40.04112046 +0000 UTC m=+20.234936373" observedRunningTime="2026-04-16 13:59:41.523469178 +0000 UTC m=+21.717285113" watchObservedRunningTime="2026-04-16 13:59:41.543471238 +0000 UTC m=+21.737287172" Apr 16 13:59:41.570888 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.570850 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7qrkr" podStartSLOduration=3.906429944 podStartE2EDuration="21.570836919s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.543311103 +0000 UTC m=+2.737127021" lastFinishedPulling="2026-04-16 13:59:40.207718074 +0000 UTC m=+20.401533996" observedRunningTime="2026-04-16 13:59:41.543244092 +0000 UTC m=+21.737060026" watchObservedRunningTime="2026-04-16 13:59:41.570836919 +0000 UTC m=+21.764652854" Apr 16 13:59:41.888594 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:41.888571 2569 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 16 13:59:42.298737 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.298570 2569 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-16T13:59:41.888589939Z","UUID":"d72c0e7f-d382-4eb1-8de1-44cfb5074f84","Handler":null,"Name":"","Endpoint":""} Apr 16 13:59:42.301859 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.301836 2569 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 16 13:59:42.301985 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.301869 2569 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 16 13:59:42.343490 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.343462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:42.343646 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:42.343580 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:42.473509 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.473471 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-2fd9m" event={"ID":"199e9928-49c7-45cd-93cb-793f77b87ea8","Type":"ContainerStarted","Data":"0a85b93dabdb45f6b3388618bfe26a8317d6e2bc307a5a684a954d779120ac5e"} Apr 16 13:59:42.477123 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.477095 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" event={"ID":"fd67681a-243f-4812-b07c-94c1aff03647","Type":"ContainerStarted","Data":"dc67645b3582fb7a7bbc3ba09225be5a8e45b95b9ccb6901d72c6c339f69598c"} Apr 16 13:59:42.504526 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.504495 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:42.505138 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.505121 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:42.507705 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:42.507665 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-2fd9m" podStartSLOduration=4.88515302 podStartE2EDuration="22.507654622s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.585202821 +0000 UTC m=+2.779018739" lastFinishedPulling="2026-04-16 13:59:40.20770442 +0000 UTC m=+20.401520341" observedRunningTime="2026-04-16 13:59:42.507114842 +0000 UTC m=+22.700930780" watchObservedRunningTime="2026-04-16 13:59:42.507654622 +0000 UTC m=+22.701470584" Apr 16 13:59:43.343764 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:43.343529 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:43.343960 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:43.343535 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:43.343960 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:43.343827 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:43.344077 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:43.343951 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:43.481418 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:43.481386 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" event={"ID":"fd67681a-243f-4812-b07c-94c1aff03647","Type":"ContainerStarted","Data":"1b12a7a909570fcb5f7de5b6e907644cfc494ed25b9352f7f83c0b8bd3fcd33c"} Apr 16 13:59:43.484873 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:43.484843 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"fa7cdd7cd82df87180a08268e9c860ff9db66eab50a9c7eff5e62a8a0453a3d7"} Apr 16 13:59:43.503256 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:43.503212 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-2hzgr" podStartSLOduration=3.408740054 podStartE2EDuration="23.503197573s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.565285203 +0000 UTC m=+2.759101116" lastFinishedPulling="2026-04-16 13:59:42.659742708 +0000 UTC m=+22.853558635" observedRunningTime="2026-04-16 13:59:43.502764462 +0000 UTC m=+23.696580420" watchObservedRunningTime="2026-04-16 13:59:43.503197573 +0000 UTC m=+23.697013507" Apr 16 13:59:44.342972 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:44.342944 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:44.343175 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:44.343066 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:44.486425 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:44.486397 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:45.343385 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:45.343351 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:45.343563 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:45.343352 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:45.343563 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:45.343478 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:45.343563 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:45.343531 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:46.343991 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.343775 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:46.344631 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:46.344014 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:46.491184 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.491150 2569 generic.go:358] "Generic (PLEG): container finished" podID="5445e4c9-57ac-4c32-964e-2165416593b6" containerID="13a0648f4a8dab4e4a4b656b703ca0897fd5405579ff3a38dbb751cf10f4877d" exitCode=0 Apr 16 13:59:46.491325 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.491232 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" event={"ID":"5445e4c9-57ac-4c32-964e-2165416593b6","Type":"ContainerDied","Data":"13a0648f4a8dab4e4a4b656b703ca0897fd5405579ff3a38dbb751cf10f4877d"} Apr 16 13:59:46.494191 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.494171 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" event={"ID":"c58f9919-25e7-4c88-ac45-816be0bc0b3a","Type":"ContainerStarted","Data":"ed3b8115ee127284738bf0cc387614a09c1e592b49403d660060682a38607528"} Apr 16 13:59:46.494498 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.494416 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:46.494498 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.494433 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:46.508247 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.508228 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:46.508345 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.508309 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:46.541529 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:46.541496 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" podStartSLOduration=8.53035736 podStartE2EDuration="26.54148644s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.550088721 +0000 UTC m=+2.743904637" lastFinishedPulling="2026-04-16 13:59:40.561217801 +0000 UTC 
m=+20.755033717" observedRunningTime="2026-04-16 13:59:46.539941436 +0000 UTC m=+26.733757372" watchObservedRunningTime="2026-04-16 13:59:46.54148644 +0000 UTC m=+26.735302372" Apr 16 13:59:47.343514 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.343298 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:47.343514 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.343298 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:47.343514 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:47.343456 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:47.343514 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:47.343508 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:47.498272 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.498244 2569 generic.go:358] "Generic (PLEG): container finished" podID="5445e4c9-57ac-4c32-964e-2165416593b6" containerID="542c9875cbd74a3e80ae129f11ddadb40aaa2df8148ab325b6be4e37e6211d20" exitCode=0 Apr 16 13:59:47.498591 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.498325 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" event={"ID":"5445e4c9-57ac-4c32-964e-2165416593b6","Type":"ContainerDied","Data":"542c9875cbd74a3e80ae129f11ddadb40aaa2df8148ab325b6be4e37e6211d20"} Apr 16 13:59:47.498591 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.498525 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:47.593171 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.593100 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fmbxr"] Apr 16 13:59:47.593331 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.593220 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:47.593331 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:47.593316 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:47.597495 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.597472 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6spc4"] Apr 16 13:59:47.597604 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.597588 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:47.597698 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:47.597679 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:47.598112 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.598096 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7pb8h"] Apr 16 13:59:47.598196 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:47.598183 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:47.598308 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:47.598287 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:48.500074 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:48.500049 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:49.343082 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:49.343041 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:49.343233 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:49.343041 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:49.343233 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:49.343148 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:49.343233 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:49.343228 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:49.343385 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:49.343046 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:49.343385 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:49.343320 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:49.503408 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:49.503324 2569 generic.go:358] "Generic (PLEG): container finished" podID="5445e4c9-57ac-4c32-964e-2165416593b6" containerID="6a0f80e3c52b93e172c44c4ff59040b62e1261b7d2656a952a1a23dd30cc759d" exitCode=0 Apr 16 13:59:49.503408 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:49.503394 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" event={"ID":"5445e4c9-57ac-4c32-964e-2165416593b6","Type":"ContainerDied","Data":"6a0f80e3c52b93e172c44c4ff59040b62e1261b7d2656a952a1a23dd30cc759d"} Apr 16 13:59:50.027209 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:50.027182 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 13:59:50.027424 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:50.027409 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:50.037643 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:50.037484 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" podUID="c58f9919-25e7-4c88-ac45-816be0bc0b3a" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 13:59:50.045536 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:50.045509 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" podUID="c58f9919-25e7-4c88-ac45-816be0bc0b3a" containerName="ovnkube-controller" probeResult="failure" output="" Apr 16 13:59:51.309220 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:51.309131 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:51.309690 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:51.309296 2569 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 16 13:59:51.310456 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:51.310436 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-xc2dz" Apr 16 13:59:51.343208 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:51.343180 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:51.343343 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:51.343186 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:51.343343 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:51.343306 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-fmbxr" podUID="09070b7d-cdb7-4268-8a8b-096e6b5cff88" Apr 16 13:59:51.343461 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:51.343186 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:51.343510 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:51.343463 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-6spc4" podUID="cab10f5b-6eb5-409e-a6a8-a1bf534e28e2" Apr 16 13:59:51.343652 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:51.343626 2569 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-7pb8h" podUID="1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17" Apr 16 13:59:53.138801 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.138727 2569 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-138-227.ec2.internal" event="NodeReady" Apr 16 13:59:53.139367 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.138890 2569 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 16 13:59:53.182161 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.182130 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-b96c7f5f9-wlwtx"] Apr 16 13:59:53.214432 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.214393 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-pj7tn"] Apr 16 13:59:53.214604 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.214548 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.217819 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.217756 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 16 13:59:53.217819 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.217798 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 16 13:59:53.217819 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.217805 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-khk5g\"" Apr 16 13:59:53.218027 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.217865 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 16 13:59:53.224813 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.224791 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 16 13:59:53.231044 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.231023 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b96c7f5f9-wlwtx"] Apr 16 13:59:53.231137 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.231051 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-gwt4s"] Apr 16 13:59:53.231206 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.231167 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.233893 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.233876 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 16 13:59:53.234134 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.234116 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 16 13:59:53.234240 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.234223 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjgsd\"" Apr 16 13:59:53.241389 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.241371 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pj7tn"] Apr 16 13:59:53.241472 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.241397 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gwt4s"] Apr 16 13:59:53.241541 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.241511 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:53.245441 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.245425 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 16 13:59:53.245541 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.245488 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pt57w\"" Apr 16 13:59:53.245541 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.245526 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 16 13:59:53.245678 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.245655 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 16 13:59:53.343748 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.343718 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:53.343911 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.343762 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:53.343911 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.343767 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:53.347118 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.347096 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 16 13:59:53.347249 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.347142 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 16 13:59:53.347249 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.347236 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p5bbb\"" Apr 16 13:59:53.347249 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.347244 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-fjvc5\"" Apr 16 13:59:53.347423 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.347333 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 16 13:59:53.347423 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.347382 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 16 13:59:53.367135 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367113 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.367242 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367148 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: 
\"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-installation-pull-secrets\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.367242 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8bjdk\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-kube-api-access-8bjdk\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.367348 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367253 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e960e93-2ea2-4f10-b638-d01eb132a93d-config-volume\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.367348 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367319 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.367348 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367341 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e960e93-2ea2-4f10-b638-d01eb132a93d-tmp-dir\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.367464 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367360 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-image-registry-private-configuration\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.367464 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367395 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fr97f\" (UniqueName: \"kubernetes.io/projected/8e960e93-2ea2-4f10-b638-d01eb132a93d-kube-api-access-fr97f\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.367464 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367411 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d0a3d85-9abd-4994-ba30-3173587f16f3-ca-trust-extracted\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.367464 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-trusted-ca\") 
pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.367464 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:53.367625 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367493 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p8gts\" (UniqueName: \"kubernetes.io/projected/1c7e2532-bb8b-4069-a258-035e33ccef02-kube-api-access-p8gts\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:53.367625 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367516 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-certificates\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.367625 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.367543 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-bound-sa-token\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.468409 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468312 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.468409 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468364 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-installation-pull-secrets\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.468637 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468438 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8bjdk\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-kube-api-access-8bjdk\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.468637 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.468478 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:53.468637 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.468499 2569 projected.go:194] Error preparing data for 
projected volume registry-tls for pod openshift-image-registry/image-registry-b96c7f5f9-wlwtx: secret "image-registry-tls" not found Apr 16 13:59:53.468637 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.468563 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls podName:8d0a3d85-9abd-4994-ba30-3173587f16f3 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.968543112 +0000 UTC m=+34.162359042 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls") pod "image-registry-b96c7f5f9-wlwtx" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3") : secret "image-registry-tls" not found Apr 16 13:59:53.468637 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468493 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e960e93-2ea2-4f10-b638-d01eb132a93d-config-volume\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468666 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468701 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e960e93-2ea2-4f10-b638-d01eb132a93d-tmp-dir\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468729 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-image-registry-private-configuration\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.468745 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.468786 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls podName:8e960e93-2ea2-4f10-b638-d01eb132a93d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.968772781 +0000 UTC m=+34.162588696 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls") pod "dns-default-pj7tn" (UID: "8e960e93-2ea2-4f10-b638-d01eb132a93d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468804 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fr97f\" (UniqueName: \"kubernetes.io/projected/8e960e93-2ea2-4f10-b638-d01eb132a93d-kube-api-access-fr97f\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468831 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d0a3d85-9abd-4994-ba30-3173587f16f3-ca-trust-extracted\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468854 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-trusted-ca\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.468890 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468885 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:53.469346 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468912 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p8gts\" (UniqueName: \"kubernetes.io/projected/1c7e2532-bb8b-4069-a258-035e33ccef02-kube-api-access-p8gts\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:53.469346 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468936 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-certificates\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.469346 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.468963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-bound-sa-token\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.469346 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.469035 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e960e93-2ea2-4f10-b638-d01eb132a93d-tmp-dir\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " 
pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.469346 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.469059 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/8e960e93-2ea2-4f10-b638-d01eb132a93d-config-volume\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.469346 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.469148 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:53.469346 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.469227 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert podName:1c7e2532-bb8b-4069-a258-035e33ccef02 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:53.96921144 +0000 UTC m=+34.163027366 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert") pod "ingress-canary-gwt4s" (UID: "1c7e2532-bb8b-4069-a258-035e33ccef02") : secret "canary-serving-cert" not found Apr 16 13:59:53.469605 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.469580 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d0a3d85-9abd-4994-ba30-3173587f16f3-ca-trust-extracted\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.469656 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.469628 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-certificates\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.470008 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.469982 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-trusted-ca\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.473135 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.473109 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-image-registry-private-configuration\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.473311 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.473294 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-installation-pull-secrets\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.478734 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.478707 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kube-api-access-fr97f\" (UniqueName: \"kubernetes.io/projected/8e960e93-2ea2-4f10-b638-d01eb132a93d-kube-api-access-fr97f\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.478849 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.478763 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-bound-sa-token\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.478908 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.478850 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p8gts\" (UniqueName: \"kubernetes.io/projected/1c7e2532-bb8b-4069-a258-035e33ccef02-kube-api-access-p8gts\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:53.479440 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.479419 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8bjdk\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-kube-api-access-8bjdk\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.972910 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.972864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:53.973118 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.972935 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:53.973118 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:53.972983 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:53.973118 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.973037 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:53.973118 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.973067 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:53.973118 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.973096 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:53.973118 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.973107 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod 
openshift-image-registry/image-registry-b96c7f5f9-wlwtx: secret "image-registry-tls" not found Apr 16 13:59:53.973118 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.973119 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls podName:8e960e93-2ea2-4f10-b638-d01eb132a93d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.973098995 +0000 UTC m=+35.166914920 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls") pod "dns-default-pj7tn" (UID: "8e960e93-2ea2-4f10-b638-d01eb132a93d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:53.973424 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.973139 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert podName:1c7e2532-bb8b-4069-a258-035e33ccef02 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.973129502 +0000 UTC m=+35.166945421 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert") pod "ingress-canary-gwt4s" (UID: "1c7e2532-bb8b-4069-a258-035e33ccef02") : secret "canary-serving-cert" not found Apr 16 13:59:53.973424 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:53.973176 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls podName:8d0a3d85-9abd-4994-ba30-3173587f16f3 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:54.973159573 +0000 UTC m=+35.166975508 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls") pod "image-registry-b96c7f5f9-wlwtx" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3") : secret "image-registry-tls" not found Apr 16 13:59:54.073390 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.073349 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 13:59:54.073611 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.073446 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:54.073611 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.073508 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 13:59:54.073611 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.073586 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs podName:1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:26.073564932 +0000 UTC m=+66.267380851 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs") pod "network-metrics-daemon-7pb8h" (UID: "1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17") : secret "metrics-daemon-secret" not found Apr 16 13:59:54.082034 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.082013 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/09070b7d-cdb7-4268-8a8b-096e6b5cff88-original-pull-secret\") pod \"global-pull-secret-syncer-fmbxr\" (UID: \"09070b7d-cdb7-4268-8a8b-096e6b5cff88\") " pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:54.173931 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.173894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:54.176748 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.176715 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q92h8\" (UniqueName: \"kubernetes.io/projected/cab10f5b-6eb5-409e-a6a8-a1bf534e28e2-kube-api-access-q92h8\") pod \"network-check-target-6spc4\" (UID: \"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2\") " pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:54.253741 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.253649 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 13:59:54.259985 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.259957 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-fmbxr" Apr 16 13:59:54.936909 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.936874 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns"] Apr 16 13:59:54.967833 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.967802 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp"] Apr 16 13:59:54.968006 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.967956 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" Apr 16 13:59:54.970844 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.970748 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 16 13:59:54.970844 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.970750 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 16 13:59:54.970844 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.970749 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 16 13:59:54.970844 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.970821 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 16 13:59:54.971128 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.970881 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-m6kvz\"" Apr 16 13:59:54.983622 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.983598 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:54.983736 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.983647 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:54.983736 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.983688 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:54.983736 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.983721 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:54.983891 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.983772 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls podName:8e960e93-2ea2-4f10-b638-d01eb132a93d nodeName:}" failed. No retries permitted until 2026-04-16 13:59:56.983753862 +0000 UTC m=+37.177569786 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls") pod "dns-default-pj7tn" (UID: "8e960e93-2ea2-4f10-b638-d01eb132a93d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:54.983891 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.983804 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:54.983891 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.983823 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:54.983891 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.983837 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b96c7f5f9-wlwtx: secret "image-registry-tls" not found Apr 16 13:59:54.983891 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.983854 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert podName:1c7e2532-bb8b-4069-a258-035e33ccef02 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:56.983838538 +0000 UTC m=+37.177654453 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert") pod "ingress-canary-gwt4s" (UID: "1c7e2532-bb8b-4069-a258-035e33ccef02") : secret "canary-serving-cert" not found Apr 16 13:59:54.983891 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:54.983877 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls podName:8d0a3d85-9abd-4994-ba30-3173587f16f3 nodeName:}" failed. No retries permitted until 2026-04-16 13:59:56.983863511 +0000 UTC m=+37.177679440 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls") pod "image-registry-b96c7f5f9-wlwtx" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3") : secret "image-registry-tls" not found Apr 16 13:59:54.985902 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.985884 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns"] Apr 16 13:59:54.985995 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.985932 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc"] Apr 16 13:59:54.986058 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.986047 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:54.989019 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:54.988998 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\"" Apr 16 13:59:55.004569 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.004545 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp"] Apr 16 13:59:55.004656 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.004572 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc"] Apr 16 13:59:55.004703 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.004677 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.007596 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.007576 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\"" Apr 16 13:59:55.007596 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.007590 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\"" Apr 16 13:59:55.007726 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.007597 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\"" Apr 16 13:59:55.007726 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.007597 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\"" Apr 16 13:59:55.084068 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084036 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/493ef4c5-267e-40ae-a87d-08710679a67c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-69b87b5899-rxcns\" (UID: \"493ef4c5-267e-40ae-a87d-08710679a67c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" Apr 16 13:59:55.084068 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084069 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-ca\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.084309 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084107 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vqgjc\" (UniqueName: \"kubernetes.io/projected/493ef4c5-267e-40ae-a87d-08710679a67c-kube-api-access-vqgjc\") pod \"managed-serviceaccount-addon-agent-69b87b5899-rxcns\" (UID: \"493ef4c5-267e-40ae-a87d-08710679a67c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" Apr 16 13:59:55.084309 ip-10-0-138-227 kubenswrapper[2569]: I0416 
13:59:55.084158 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0181aa92-2818-44a3-834d-446bb7ae47b0-klusterlet-config\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.084309 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084187 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.084309 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084213 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9kgtz\" (UniqueName: \"kubernetes.io/projected/09f5aee5-4aee-460f-a0f2-2434d35686af-kube-api-access-9kgtz\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.084309 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084296 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0181aa92-2818-44a3-834d-446bb7ae47b0-tmp\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.084565 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084333 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-hub\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.084565 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084364 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.084565 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084387 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/09f5aee5-4aee-460f-a0f2-2434d35686af-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.084565 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.084410 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrbx6\" (UniqueName: 
\"kubernetes.io/projected/0181aa92-2818-44a3-834d-446bb7ae47b0-kube-api-access-rrbx6\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.185731 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185684 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-hub\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.185731 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185731 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.186330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185759 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/09f5aee5-4aee-460f-a0f2-2434d35686af-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.186330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185802 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrbx6\" (UniqueName: \"kubernetes.io/projected/0181aa92-2818-44a3-834d-446bb7ae47b0-kube-api-access-rrbx6\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.186330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185871 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/493ef4c5-267e-40ae-a87d-08710679a67c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-69b87b5899-rxcns\" (UID: \"493ef4c5-267e-40ae-a87d-08710679a67c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" Apr 16 13:59:55.186330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185898 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-ca\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.186330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185934 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vqgjc\" (UniqueName: \"kubernetes.io/projected/493ef4c5-267e-40ae-a87d-08710679a67c-kube-api-access-vqgjc\") pod \"managed-serviceaccount-addon-agent-69b87b5899-rxcns\" (UID: \"493ef4c5-267e-40ae-a87d-08710679a67c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" Apr 16 13:59:55.186330 
ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185963 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0181aa92-2818-44a3-834d-446bb7ae47b0-klusterlet-config\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.186330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185980 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.186330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.185998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9kgtz\" (UniqueName: \"kubernetes.io/projected/09f5aee5-4aee-460f-a0f2-2434d35686af-kube-api-access-9kgtz\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.186330 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.186041 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0181aa92-2818-44a3-834d-446bb7ae47b0-tmp\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.186750 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.186402 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0181aa92-2818-44a3-834d-446bb7ae47b0-tmp\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.187153 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.187052 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/09f5aee5-4aee-460f-a0f2-2434d35686af-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.189152 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.189124 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-ca\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.189152 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.189145 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/0181aa92-2818-44a3-834d-446bb7ae47b0-klusterlet-config\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.189339 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.189191 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.189339 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.189306 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/493ef4c5-267e-40ae-a87d-08710679a67c-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-69b87b5899-rxcns\" (UID: \"493ef4c5-267e-40ae-a87d-08710679a67c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" Apr 16 13:59:55.189545 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.189527 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-hub\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.189945 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.189928 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/09f5aee5-4aee-460f-a0f2-2434d35686af-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.195482 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.195458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9kgtz\" (UniqueName: \"kubernetes.io/projected/09f5aee5-4aee-460f-a0f2-2434d35686af-kube-api-access-9kgtz\") pod \"cluster-proxy-proxy-agent-79bdb79488-bjfvc\" (UID: \"09f5aee5-4aee-460f-a0f2-2434d35686af\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.195928 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.195908 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrbx6\" (UniqueName: \"kubernetes.io/projected/0181aa92-2818-44a3-834d-446bb7ae47b0-kube-api-access-rrbx6\") pod \"klusterlet-addon-workmgr-f95855bf7-6xtgp\" (UID: \"0181aa92-2818-44a3-834d-446bb7ae47b0\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.196261 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.196238 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vqgjc\" (UniqueName: \"kubernetes.io/projected/493ef4c5-267e-40ae-a87d-08710679a67c-kube-api-access-vqgjc\") pod \"managed-serviceaccount-addon-agent-69b87b5899-rxcns\" (UID: \"493ef4c5-267e-40ae-a87d-08710679a67c\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" Apr 16 13:59:55.291952 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.291921 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" Apr 16 13:59:55.299076 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.299058 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 13:59:55.313848 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.313822 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 13:59:55.698937 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.698678 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-fmbxr"] Apr 16 13:59:55.706129 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.706106 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-6spc4"] Apr 16 13:59:55.718345 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.718326 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns"] Apr 16 13:59:55.724684 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.724663 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc"] Apr 16 13:59:55.728078 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:55.728060 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp"] Apr 16 13:59:55.760367 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:55.760288 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09070b7d_cdb7_4268_8a8b_096e6b5cff88.slice/crio-586b2cd1804b6cdece1c26050a756ef8d4a63d7db7493ced463e8216511945fc WatchSource:0}: Error finding container 586b2cd1804b6cdece1c26050a756ef8d4a63d7db7493ced463e8216511945fc: Status 404 returned error can't find the container with id 586b2cd1804b6cdece1c26050a756ef8d4a63d7db7493ced463e8216511945fc Apr 16 13:59:55.761116 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:55.761091 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcab10f5b_6eb5_409e_a6a8_a1bf534e28e2.slice/crio-9125f82cffdc4aee19c529aa888735ff577d41cfbb341bafc6757c04a46c7f10 WatchSource:0}: Error finding container 9125f82cffdc4aee19c529aa888735ff577d41cfbb341bafc6757c04a46c7f10: Status 404 returned error can't find the container with id 9125f82cffdc4aee19c529aa888735ff577d41cfbb341bafc6757c04a46c7f10 Apr 16 13:59:55.761816 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:55.761770 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493ef4c5_267e_40ae_a87d_08710679a67c.slice/crio-2dee5524147920a63a0a39b172a4344b5de96ca27d90534cb05b8be072372ada WatchSource:0}: Error finding container 2dee5524147920a63a0a39b172a4344b5de96ca27d90534cb05b8be072372ada: Status 404 returned error can't find the container with id 2dee5524147920a63a0a39b172a4344b5de96ca27d90534cb05b8be072372ada Apr 16 13:59:55.763447 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:55.763115 2569 manager.go:1169] Failed to process watch event {EventType:0 
Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod09f5aee5_4aee_460f_a0f2_2434d35686af.slice/crio-f228c82c281ff683032e3cb99cf974d67365cbd7ccdf1093c452b988ae8f8a90 WatchSource:0}: Error finding container f228c82c281ff683032e3cb99cf974d67365cbd7ccdf1093c452b988ae8f8a90: Status 404 returned error can't find the container with id f228c82c281ff683032e3cb99cf974d67365cbd7ccdf1093c452b988ae8f8a90 Apr 16 13:59:55.763512 ip-10-0-138-227 kubenswrapper[2569]: W0416 13:59:55.763477 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0181aa92_2818_44a3_834d_446bb7ae47b0.slice/crio-e8ad3f5aa2abbca3b36c2634bd1b9091f05134e75490407a70f3c90cda593197 WatchSource:0}: Error finding container e8ad3f5aa2abbca3b36c2634bd1b9091f05134e75490407a70f3c90cda593197: Status 404 returned error can't find the container with id e8ad3f5aa2abbca3b36c2634bd1b9091f05134e75490407a70f3c90cda593197 Apr 16 13:59:56.520887 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:56.520785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" event={"ID":"09f5aee5-4aee-460f-a0f2-2434d35686af","Type":"ContainerStarted","Data":"f228c82c281ff683032e3cb99cf974d67365cbd7ccdf1093c452b988ae8f8a90"} Apr 16 13:59:56.522499 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:56.522469 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" event={"ID":"493ef4c5-267e-40ae-a87d-08710679a67c","Type":"ContainerStarted","Data":"2dee5524147920a63a0a39b172a4344b5de96ca27d90534cb05b8be072372ada"} Apr 16 13:59:56.527052 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:56.527015 2569 generic.go:358] "Generic (PLEG): container finished" podID="5445e4c9-57ac-4c32-964e-2165416593b6" containerID="3c2a55fd3a509d1c08cd1625ba62029b4fb594d82379ed5f9c487a95640d1d14" exitCode=0 Apr 16 13:59:56.527171 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:56.527085 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" event={"ID":"5445e4c9-57ac-4c32-964e-2165416593b6","Type":"ContainerDied","Data":"3c2a55fd3a509d1c08cd1625ba62029b4fb594d82379ed5f9c487a95640d1d14"} Apr 16 13:59:56.532152 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:56.532089 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" event={"ID":"0181aa92-2818-44a3-834d-446bb7ae47b0","Type":"ContainerStarted","Data":"e8ad3f5aa2abbca3b36c2634bd1b9091f05134e75490407a70f3c90cda593197"} Apr 16 13:59:56.535923 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:56.535873 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6spc4" event={"ID":"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2","Type":"ContainerStarted","Data":"9125f82cffdc4aee19c529aa888735ff577d41cfbb341bafc6757c04a46c7f10"} Apr 16 13:59:56.537524 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:56.537484 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fmbxr" event={"ID":"09070b7d-cdb7-4268-8a8b-096e6b5cff88","Type":"ContainerStarted","Data":"586b2cd1804b6cdece1c26050a756ef8d4a63d7db7493ced463e8216511945fc"} Apr 16 13:59:57.000521 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:57.000484 2569 reconciler_common.go:224] "operationExecutor.MountVolume 
started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 13:59:57.000798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:57.000595 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 13:59:57.000798 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:57.000637 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 13:59:57.000798 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:57.000788 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 13:59:57.000966 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:57.000847 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert podName:1c7e2532-bb8b-4069-a258-035e33ccef02 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:01.000829528 +0000 UTC m=+41.194645446 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert") pod "ingress-canary-gwt4s" (UID: "1c7e2532-bb8b-4069-a258-035e33ccef02") : secret "canary-serving-cert" not found Apr 16 13:59:57.001303 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:57.001284 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 13:59:57.001303 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:57.001305 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b96c7f5f9-wlwtx: secret "image-registry-tls" not found Apr 16 13:59:57.001508 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:57.001352 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls podName:8d0a3d85-9abd-4994-ba30-3173587f16f3 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:01.001336221 +0000 UTC m=+41.195152138 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls") pod "image-registry-b96c7f5f9-wlwtx" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3") : secret "image-registry-tls" not found Apr 16 13:59:57.001508 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:57.001421 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 13:59:57.001508 ip-10-0-138-227 kubenswrapper[2569]: E0416 13:59:57.001452 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls podName:8e960e93-2ea2-4f10-b638-d01eb132a93d nodeName:}" failed. 
No retries permitted until 2026-04-16 14:00:01.001441361 +0000 UTC m=+41.195257278 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls") pod "dns-default-pj7tn" (UID: "8e960e93-2ea2-4f10-b638-d01eb132a93d") : secret "dns-default-metrics-tls" not found Apr 16 13:59:57.554547 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:57.554076 2569 generic.go:358] "Generic (PLEG): container finished" podID="5445e4c9-57ac-4c32-964e-2165416593b6" containerID="c484d3f5224ab5f4d9a8feea27b4010a749136865cddde20c43bf00382101512" exitCode=0 Apr 16 13:59:57.554547 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:57.554140 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" event={"ID":"5445e4c9-57ac-4c32-964e-2165416593b6","Type":"ContainerDied","Data":"c484d3f5224ab5f4d9a8feea27b4010a749136865cddde20c43bf00382101512"} Apr 16 13:59:58.567685 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:58.566860 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" event={"ID":"5445e4c9-57ac-4c32-964e-2165416593b6","Type":"ContainerStarted","Data":"58a9b2ad4675b2930b8c96ad85a6127502ca3f57fefd0d96a6ca1a2750421bcc"} Apr 16 13:59:58.600715 ip-10-0-138-227 kubenswrapper[2569]: I0416 13:59:58.599162 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-2j4nh" podStartSLOduration=5.372459138 podStartE2EDuration="38.599142973s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:22.572731992 +0000 UTC m=+2.766547905" lastFinishedPulling="2026-04-16 13:59:55.799415813 +0000 UTC m=+35.993231740" observedRunningTime="2026-04-16 13:59:58.598631688 +0000 UTC m=+38.792447623" watchObservedRunningTime="2026-04-16 13:59:58.599142973 +0000 UTC m=+38.792958906" Apr 16 14:00:01.037919 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:01.037880 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:01.037930 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:01.037976 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:01.038053 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:01.038064 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret 
"canary-serving-cert" not found Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:01.038087 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:01.038104 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls podName:8e960e93-2ea2-4f10-b638-d01eb132a93d nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.038091197 +0000 UTC m=+49.231907110 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls") pod "dns-default-pj7tn" (UID: "8e960e93-2ea2-4f10-b638-d01eb132a93d") : secret "dns-default-metrics-tls" not found Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:01.038105 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b96c7f5f9-wlwtx: secret "image-registry-tls" not found Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:01.038128 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert podName:1c7e2532-bb8b-4069-a258-035e33ccef02 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.038110788 +0000 UTC m=+49.231926709 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert") pod "ingress-canary-gwt4s" (UID: "1c7e2532-bb8b-4069-a258-035e33ccef02") : secret "canary-serving-cert" not found Apr 16 14:00:01.038427 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:01.038153 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls podName:8d0a3d85-9abd-4994-ba30-3173587f16f3 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:09.038140986 +0000 UTC m=+49.231956898 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls") pod "image-registry-b96c7f5f9-wlwtx" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3") : secret "image-registry-tls" not found Apr 16 14:00:06.590306 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.590252 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-6spc4" event={"ID":"cab10f5b-6eb5-409e-a6a8-a1bf534e28e2","Type":"ContainerStarted","Data":"9a0f3692d9f4c6a661407ac3eb749a97aa76f866519fc555886bcc229ebd0158"} Apr 16 14:00:06.590720 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.590350 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 14:00:06.591636 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.591616 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-fmbxr" event={"ID":"09070b7d-cdb7-4268-8a8b-096e6b5cff88","Type":"ContainerStarted","Data":"7f0922ed11ae252d15a6c21f7e5140c82b03953da79e139fb1cf6d1525b0dad1"} Apr 16 14:00:06.592892 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.592873 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" event={"ID":"09f5aee5-4aee-460f-a0f2-2434d35686af","Type":"ContainerStarted","Data":"3741bec7e250cc1001a12faa40eb9d85c4f472d206ceba6dc58fe383e9633eed"} Apr 16 14:00:06.594001 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.593980 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" event={"ID":"493ef4c5-267e-40ae-a87d-08710679a67c","Type":"ContainerStarted","Data":"fef2acdd61297d5b5ec47be2ffefe60e254a512dd4c2979017332f70cb484eec"} Apr 16 14:00:06.595076 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.595058 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" event={"ID":"0181aa92-2818-44a3-834d-446bb7ae47b0","Type":"ContainerStarted","Data":"37abfa33cc383e2c4df419a3f15170fa766d77a85cf53791240df802bc1dfe7b"} Apr 16 14:00:06.595250 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.595236 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 14:00:06.596852 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.596835 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" Apr 16 14:00:06.607518 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.607484 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-6spc4" podStartSLOduration=36.651304033 podStartE2EDuration="46.607474201s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:55.77472469 +0000 UTC m=+35.968540604" lastFinishedPulling="2026-04-16 14:00:05.730894855 +0000 UTC m=+45.924710772" observedRunningTime="2026-04-16 14:00:06.6065099 +0000 UTC m=+46.800325830" watchObservedRunningTime="2026-04-16 14:00:06.607474201 +0000 UTC m=+46.801290135" Apr 16 14:00:06.622857 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.622821 2569 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-f95855bf7-6xtgp" podStartSLOduration=2.650454413 podStartE2EDuration="12.622811302s" podCreationTimestamp="2026-04-16 13:59:54 +0000 UTC" firstStartedPulling="2026-04-16 13:59:55.774750463 +0000 UTC m=+35.968566376" lastFinishedPulling="2026-04-16 14:00:05.747107337 +0000 UTC m=+45.940923265" observedRunningTime="2026-04-16 14:00:06.622378052 +0000 UTC m=+46.816193987" watchObservedRunningTime="2026-04-16 14:00:06.622811302 +0000 UTC m=+46.816627248" Apr 16 14:00:06.638180 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.638138 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-fmbxr" podStartSLOduration=36.669590196 podStartE2EDuration="46.638126778s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 13:59:55.762454934 +0000 UTC m=+35.956270852" lastFinishedPulling="2026-04-16 14:00:05.730991506 +0000 UTC m=+45.924807434" observedRunningTime="2026-04-16 14:00:06.637540356 +0000 UTC m=+46.831356290" watchObservedRunningTime="2026-04-16 14:00:06.638126778 +0000 UTC m=+46.831942713" Apr 16 14:00:06.652457 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:06.652420 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-69b87b5899-rxcns" podStartSLOduration=2.697363096 podStartE2EDuration="12.652408862s" podCreationTimestamp="2026-04-16 13:59:54 +0000 UTC" firstStartedPulling="2026-04-16 13:59:55.775076931 +0000 UTC m=+35.968892851" lastFinishedPulling="2026-04-16 14:00:05.730122698 +0000 UTC m=+45.923938617" observedRunningTime="2026-04-16 14:00:06.652112298 +0000 UTC m=+46.845928231" watchObservedRunningTime="2026-04-16 14:00:06.652408862 +0000 UTC m=+46.846224797" Apr 16 14:00:09.104920 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:09.104886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 14:00:09.105340 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:09.104935 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 14:00:09.105340 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:09.104977 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:00:09.105340 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:09.105030 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:09.105340 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:09.105088 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:09.105340 
ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:09.105094 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls podName:8e960e93-2ea2-4f10-b638-d01eb132a93d nodeName:}" failed. No retries permitted until 2026-04-16 14:00:25.105077996 +0000 UTC m=+65.298893909 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls") pod "dns-default-pj7tn" (UID: "8e960e93-2ea2-4f10-b638-d01eb132a93d") : secret "dns-default-metrics-tls" not found Apr 16 14:00:09.105340 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:09.105141 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert podName:1c7e2532-bb8b-4069-a258-035e33ccef02 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:25.105129625 +0000 UTC m=+65.298945537 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert") pod "ingress-canary-gwt4s" (UID: "1c7e2532-bb8b-4069-a258-035e33ccef02") : secret "canary-serving-cert" not found Apr 16 14:00:09.105340 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:09.105175 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:00:09.105340 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:09.105192 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b96c7f5f9-wlwtx: secret "image-registry-tls" not found Apr 16 14:00:09.105340 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:09.105233 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls podName:8d0a3d85-9abd-4994-ba30-3173587f16f3 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:25.105222258 +0000 UTC m=+65.299038171 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls") pod "image-registry-b96c7f5f9-wlwtx" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3") : secret "image-registry-tls" not found Apr 16 14:00:09.602900 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:09.602862 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" event={"ID":"09f5aee5-4aee-460f-a0f2-2434d35686af","Type":"ContainerStarted","Data":"1151d60e8028f1da1ef7ab89609ad0d77a54d282e92deef1ce5f07b7a1df99bd"} Apr 16 14:00:09.602900 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:09.602901 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" event={"ID":"09f5aee5-4aee-460f-a0f2-2434d35686af","Type":"ContainerStarted","Data":"fe1e87e744dcce28d8b46d3406a04327bdc0ddaf98b232a6b54071f8ac170ddf"} Apr 16 14:00:09.621800 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:09.621753 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" podStartSLOduration=2.127654869 podStartE2EDuration="15.621739776s" podCreationTimestamp="2026-04-16 13:59:54 +0000 UTC" firstStartedPulling="2026-04-16 13:59:55.774718089 +0000 UTC m=+35.968534002" lastFinishedPulling="2026-04-16 14:00:09.268802989 +0000 UTC m=+49.462618909" observedRunningTime="2026-04-16 14:00:09.621295493 +0000 UTC m=+49.815111429" watchObservedRunningTime="2026-04-16 14:00:09.621739776 +0000 UTC m=+49.815555710" Apr 16 14:00:20.047542 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:20.047514 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-xf96q" Apr 16 14:00:25.117440 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:25.117394 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 14:00:25.117806 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:25.117452 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 14:00:25.117806 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:25.117496 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:00:25.117806 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:25.117549 2569 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 16 14:00:25.117806 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:25.117593 2569 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 16 14:00:25.117806 ip-10-0-138-227 
kubenswrapper[2569]: E0416 14:00:25.117605 2569 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-b96c7f5f9-wlwtx: secret "image-registry-tls" not found Apr 16 14:00:25.117806 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:25.117606 2569 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 16 14:00:25.117806 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:25.117618 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls podName:8e960e93-2ea2-4f10-b638-d01eb132a93d nodeName:}" failed. No retries permitted until 2026-04-16 14:00:57.117598849 +0000 UTC m=+97.311414787 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls") pod "dns-default-pj7tn" (UID: "8e960e93-2ea2-4f10-b638-d01eb132a93d") : secret "dns-default-metrics-tls" not found Apr 16 14:00:25.117806 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:25.117649 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls podName:8d0a3d85-9abd-4994-ba30-3173587f16f3 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:57.117638354 +0000 UTC m=+97.311454268 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls") pod "image-registry-b96c7f5f9-wlwtx" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3") : secret "image-registry-tls" not found Apr 16 14:00:25.117806 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:25.117665 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert podName:1c7e2532-bb8b-4069-a258-035e33ccef02 nodeName:}" failed. No retries permitted until 2026-04-16 14:00:57.117657289 +0000 UTC m=+97.311473207 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert") pod "ingress-canary-gwt4s" (UID: "1c7e2532-bb8b-4069-a258-035e33ccef02") : secret "canary-serving-cert" not found Apr 16 14:00:26.125755 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:26.125723 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 14:00:26.126140 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:26.125834 2569 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 16 14:00:26.126140 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:00:26.125885 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs podName:1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:30.125871071 +0000 UTC m=+130.319686984 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs") pod "network-metrics-daemon-7pb8h" (UID: "1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17") : secret "metrics-daemon-secret" not found Apr 16 14:00:37.599919 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:37.599891 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-6spc4" Apr 16 14:00:38.288676 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:38.288644 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qt8cq_19600904-44a6-4d4d-aaf8-38af8d1e94b3/dns-node-resolver/0.log" Apr 16 14:00:38.688478 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:38.688414 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7qrkr_bafb680b-a4fd-4b7b-931c-6d0198bbe401/node-ca/0.log" Apr 16 14:00:57.157478 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.157442 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 14:00:57.157842 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.157485 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 14:00:57.157842 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.157614 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:00:57.161027 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.161001 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/1c7e2532-bb8b-4069-a258-035e33ccef02-cert\") pod \"ingress-canary-gwt4s\" (UID: \"1c7e2532-bb8b-4069-a258-035e33ccef02\") " pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 14:00:57.161133 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.161079 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/8e960e93-2ea2-4f10-b638-d01eb132a93d-metrics-tls\") pod \"dns-default-pj7tn\" (UID: \"8e960e93-2ea2-4f10-b638-d01eb132a93d\") " pod="openshift-dns/dns-default-pj7tn" Apr 16 14:00:57.161171 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.161155 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"image-registry-b96c7f5f9-wlwtx\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:00:57.430158 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.430074 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-khk5g\"" Apr 16 
14:00:57.438422 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.438402 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:00:57.444216 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.444194 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-wjgsd\"" Apr 16 14:00:57.451120 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.451097 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-pj7tn" Apr 16 14:00:57.454282 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.454253 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pt57w\"" Apr 16 14:00:57.462042 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.462024 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-gwt4s" Apr 16 14:00:57.579669 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.579601 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-b96c7f5f9-wlwtx"] Apr 16 14:00:57.584359 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:00:57.584299 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8d0a3d85_9abd_4994_ba30_3173587f16f3.slice/crio-27cead803187e8d835758763803a22a4efe4ee55bfc836b6c1a6cad827c0a873 WatchSource:0}: Error finding container 27cead803187e8d835758763803a22a4efe4ee55bfc836b6c1a6cad827c0a873: Status 404 returned error can't find the container with id 27cead803187e8d835758763803a22a4efe4ee55bfc836b6c1a6cad827c0a873 Apr 16 14:00:57.594710 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.594663 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-pj7tn"] Apr 16 14:00:57.598295 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:00:57.598245 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e960e93_2ea2_4f10_b638_d01eb132a93d.slice/crio-6e8a14676bc20b422d6b703581cfcdd39db3fedcb9e08aaca41e3a72149383e2 WatchSource:0}: Error finding container 6e8a14676bc20b422d6b703581cfcdd39db3fedcb9e08aaca41e3a72149383e2: Status 404 returned error can't find the container with id 6e8a14676bc20b422d6b703581cfcdd39db3fedcb9e08aaca41e3a72149383e2 Apr 16 14:00:57.613529 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.613505 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-gwt4s"] Apr 16 14:00:57.616812 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:00:57.616792 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1c7e2532_bb8b_4069_a258_035e33ccef02.slice/crio-a9c7931502fa16881ca086b1db23c1cb5388bc4f324f3d437cea28d33350f020 WatchSource:0}: Error finding container a9c7931502fa16881ca086b1db23c1cb5388bc4f324f3d437cea28d33350f020: Status 404 returned error can't find the container with id a9c7931502fa16881ca086b1db23c1cb5388bc4f324f3d437cea28d33350f020 Apr 16 14:00:57.715990 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.715911 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" 
event={"ID":"8d0a3d85-9abd-4994-ba30-3173587f16f3","Type":"ContainerStarted","Data":"d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8"} Apr 16 14:00:57.715990 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.715950 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" event={"ID":"8d0a3d85-9abd-4994-ba30-3173587f16f3","Type":"ContainerStarted","Data":"27cead803187e8d835758763803a22a4efe4ee55bfc836b6c1a6cad827c0a873"} Apr 16 14:00:57.716207 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.716098 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:00:57.716955 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.716933 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gwt4s" event={"ID":"1c7e2532-bb8b-4069-a258-035e33ccef02","Type":"ContainerStarted","Data":"a9c7931502fa16881ca086b1db23c1cb5388bc4f324f3d437cea28d33350f020"} Apr 16 14:00:57.717906 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.717887 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pj7tn" event={"ID":"8e960e93-2ea2-4f10-b638-d01eb132a93d","Type":"ContainerStarted","Data":"6e8a14676bc20b422d6b703581cfcdd39db3fedcb9e08aaca41e3a72149383e2"} Apr 16 14:00:57.745409 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:57.745370 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" podStartSLOduration=97.745357592 podStartE2EDuration="1m37.745357592s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:00:57.744668079 +0000 UTC m=+97.938484015" watchObservedRunningTime="2026-04-16 14:00:57.745357592 +0000 UTC m=+97.939173526" Apr 16 14:00:59.651160 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.651123 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-f6dwk"] Apr 16 14:00:59.654584 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.654544 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.657449 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.657419 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 16 14:00:59.657579 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.657502 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 16 14:00:59.657579 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.657517 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 16 14:00:59.658887 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.658867 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-sdpvr\"" Apr 16 14:00:59.659030 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.658929 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 16 14:00:59.664983 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.664936 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f6dwk"] Apr 16 14:00:59.724822 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.724739 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-gwt4s" event={"ID":"1c7e2532-bb8b-4069-a258-035e33ccef02","Type":"ContainerStarted","Data":"4b68e1fbca0a5e6178f42f1b4817d8bba5ca9025be1286303e93dbe68061654c"} Apr 16 14:00:59.750809 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.750751 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-gwt4s" podStartSLOduration=64.918818694 podStartE2EDuration="1m6.750738394s" podCreationTimestamp="2026-04-16 13:59:53 +0000 UTC" firstStartedPulling="2026-04-16 14:00:57.618596317 +0000 UTC m=+97.812412230" lastFinishedPulling="2026-04-16 14:00:59.450516 +0000 UTC m=+99.644331930" observedRunningTime="2026-04-16 14:00:59.749615382 +0000 UTC m=+99.943431317" watchObservedRunningTime="2026-04-16 14:00:59.750738394 +0000 UTC m=+99.944554329" Apr 16 14:00:59.779574 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.779536 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.779729 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.779590 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-data-volume\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.779729 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.779629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwcjh\" (UniqueName: \"kubernetes.io/projected/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-kube-api-access-nwcjh\") pod 
\"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.779729 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.779654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-crio-socket\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.779847 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.779766 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.880998 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.880955 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.881188 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.881104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.881188 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.881144 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-data-volume\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.881188 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.881170 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwcjh\" (UniqueName: \"kubernetes.io/projected/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-kube-api-access-nwcjh\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.881415 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.881215 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-crio-socket\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.881415 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.881342 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-crio-socket\") pod \"insights-runtime-extractor-f6dwk\" (UID: 
\"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.881581 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.881558 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-data-volume\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.881718 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.881698 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.883913 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.883889 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.890809 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.890785 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwcjh\" (UniqueName: \"kubernetes.io/projected/ad87cd87-9cdc-4f67-9399-3c3befc3e0d4-kube-api-access-nwcjh\") pod \"insights-runtime-extractor-f6dwk\" (UID: \"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4\") " pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:00:59.966986 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:00:59.966948 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-runtime-extractor-f6dwk" Apr 16 14:01:00.103641 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:00.103609 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-f6dwk"] Apr 16 14:01:00.106955 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:01:00.106925 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad87cd87_9cdc_4f67_9399_3c3befc3e0d4.slice/crio-dc24ad51c16f143fe8695b97e60709ed0a20f198423eaca0e1266e79b2f2fb7c WatchSource:0}: Error finding container dc24ad51c16f143fe8695b97e60709ed0a20f198423eaca0e1266e79b2f2fb7c: Status 404 returned error can't find the container with id dc24ad51c16f143fe8695b97e60709ed0a20f198423eaca0e1266e79b2f2fb7c Apr 16 14:01:00.729208 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:00.729170 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pj7tn" event={"ID":"8e960e93-2ea2-4f10-b638-d01eb132a93d","Type":"ContainerStarted","Data":"5eb8cc6bddebe9f1348860ed5726dbae7fdd11d2aec9c98a9979ddedc82be286"} Apr 16 14:01:00.730528 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:00.730500 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6dwk" event={"ID":"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4","Type":"ContainerStarted","Data":"80b649ee57397b9485ade5342d5d6c66eb1becaa72e81842333a3d1965c4f4df"} Apr 16 14:01:00.730664 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:00.730537 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6dwk" event={"ID":"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4","Type":"ContainerStarted","Data":"dc24ad51c16f143fe8695b97e60709ed0a20f198423eaca0e1266e79b2f2fb7c"} Apr 16 14:01:01.735204 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:01.735169 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-pj7tn" event={"ID":"8e960e93-2ea2-4f10-b638-d01eb132a93d","Type":"ContainerStarted","Data":"a5d81376973544fc9c78d75619dc0818e70e532bface85f270d73e2a43fc8e2b"} Apr 16 14:01:01.735672 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:01.735383 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-pj7tn" Apr 16 14:01:01.757694 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:01.757652 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-pj7tn" podStartSLOduration=65.797079616 podStartE2EDuration="1m8.757639113s" podCreationTimestamp="2026-04-16 13:59:53 +0000 UTC" firstStartedPulling="2026-04-16 14:00:57.59998793 +0000 UTC m=+97.793803843" lastFinishedPulling="2026-04-16 14:01:00.560547427 +0000 UTC m=+100.754363340" observedRunningTime="2026-04-16 14:01:01.756512126 +0000 UTC m=+101.950328060" watchObservedRunningTime="2026-04-16 14:01:01.757639113 +0000 UTC m=+101.951455048" Apr 16 14:01:03.741245 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:03.741219 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-f6dwk" event={"ID":"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4","Type":"ContainerStarted","Data":"4dda76e278e36437387cfd3432608d9f6a1ab5551dc40f992f130aff1f96ae71"} Apr 16 14:01:05.747779 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:05.747736 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-insights/insights-runtime-extractor-f6dwk" event={"ID":"ad87cd87-9cdc-4f67-9399-3c3befc3e0d4","Type":"ContainerStarted","Data":"458efe4598a94e01a4be842ee08828d721b446d9ee2c66220da4f6bcf8f2e196"} Apr 16 14:01:05.776923 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:05.776878 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-f6dwk" podStartSLOduration=1.786275552 podStartE2EDuration="6.776865256s" podCreationTimestamp="2026-04-16 14:00:59 +0000 UTC" firstStartedPulling="2026-04-16 14:01:00.165861958 +0000 UTC m=+100.359677871" lastFinishedPulling="2026-04-16 14:01:05.156451648 +0000 UTC m=+105.350267575" observedRunningTime="2026-04-16 14:01:05.775113078 +0000 UTC m=+105.968929013" watchObservedRunningTime="2026-04-16 14:01:05.776865256 +0000 UTC m=+105.970681169" Apr 16 14:01:11.740076 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:11.740047 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-pj7tn" Apr 16 14:01:14.056261 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.056228 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-7j9ch"] Apr 16 14:01:14.060674 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.060652 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.073889 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.073872 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 16 14:01:14.074002 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.073984 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 16 14:01:14.074050 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.073983 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 16 14:01:14.074050 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.074028 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 16 14:01:14.075383 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.075370 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 16 14:01:14.075425 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.075375 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 16 14:01:14.078044 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.078030 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-w66d6\"" Apr 16 14:01:14.190422 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190400 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.190614 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190438 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-root\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.190614 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190462 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-sys\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.190614 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190510 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6rrp\" (UniqueName: \"kubernetes.io/projected/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-kube-api-access-v6rrp\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.190614 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190534 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-textfile\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.190777 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190620 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-wtmp\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.190777 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190654 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.190777 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190677 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-accelerators-collector-config\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.190777 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.190706 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-metrics-client-ca\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.291823 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291799 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: 
\"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-wtmp\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.291941 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291829 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.291941 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-accelerators-collector-config\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.291941 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291872 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-metrics-client-ca\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.291941 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291897 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.291941 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-root\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292192 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291960 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-sys\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292192 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291968 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-wtmp\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292192 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.291984 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6rrp\" (UniqueName: \"kubernetes.io/projected/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-kube-api-access-v6rrp\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292192 ip-10-0-138-227 
kubenswrapper[2569]: I0416 14:01:14.292006 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-textfile\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292192 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.292027 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-root\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292192 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.292040 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-sys\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292192 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:01:14.292062 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:01:14.292192 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:01:14.292177 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls podName:4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:14.792149944 +0000 UTC m=+114.985965861 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls") pod "node-exporter-7j9ch" (UID: "4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914") : secret "node-exporter-tls" not found Apr 16 14:01:14.292597 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.292378 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-textfile\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292597 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.292532 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-accelerators-collector-config\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.292597 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.292547 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-metrics-client-ca\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.294210 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.294194 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-kube-rbac-proxy-config\") 
pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.301513 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.301492 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6rrp\" (UniqueName: \"kubernetes.io/projected/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-kube-api-access-v6rrp\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.796413 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:14.796381 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:14.796573 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:01:14.796492 2569 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 16 14:01:14.796573 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:01:14.796548 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls podName:4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914 nodeName:}" failed. No retries permitted until 2026-04-16 14:01:15.796534961 +0000 UTC m=+115.990350874 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls") pod "node-exporter-7j9ch" (UID: "4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914") : secret "node-exporter-tls" not found Apr 16 14:01:15.803435 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:15.803402 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:15.805666 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:15.805643 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914-node-exporter-tls\") pod \"node-exporter-7j9ch\" (UID: \"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914\") " pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:15.869745 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:15.869719 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-7j9ch" Apr 16 14:01:15.877314 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:01:15.877289 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4c2a5a7c_50f6_4b62_8d7c_4f392ea3a914.slice/crio-0901d5056037e1f18f2ce440680a7d4b2e428632747803e63893e593489f05ae WatchSource:0}: Error finding container 0901d5056037e1f18f2ce440680a7d4b2e428632747803e63893e593489f05ae: Status 404 returned error can't find the container with id 0901d5056037e1f18f2ce440680a7d4b2e428632747803e63893e593489f05ae Apr 16 14:01:16.773689 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:16.773659 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7j9ch" event={"ID":"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914","Type":"ContainerStarted","Data":"0901d5056037e1f18f2ce440680a7d4b2e428632747803e63893e593489f05ae"} Apr 16 14:01:17.442865 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:17.442820 2569 patch_prober.go:28] interesting pod/image-registry-b96c7f5f9-wlwtx container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 16 14:01:17.443299 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:17.442910 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" podUID="8d0a3d85-9abd-4994-ba30-3173587f16f3" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:01:18.725022 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:18.724994 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:01:18.783288 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:18.783243 2569 generic.go:358] "Generic (PLEG): container finished" podID="4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914" containerID="3169282b57de6c63f573fe82ce43a5a543331a32e85e568dc0ef8ec44569d35a" exitCode=0 Apr 16 14:01:18.783438 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:18.783306 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7j9ch" event={"ID":"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914","Type":"ContainerDied","Data":"3169282b57de6c63f573fe82ce43a5a543331a32e85e568dc0ef8ec44569d35a"} Apr 16 14:01:19.787998 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:19.787966 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7j9ch" event={"ID":"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914","Type":"ContainerStarted","Data":"b0cca5d55648f0a7ab8e0aad760980ec9bd4801d646b8e417390f2ce52250cc6"} Apr 16 14:01:19.787998 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:19.788000 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-7j9ch" event={"ID":"4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914","Type":"ContainerStarted","Data":"225913b0f4708b64f458b178f7e61b98f4249edd6ed71cc8a166cbeb76c765c6"} Apr 16 14:01:19.810518 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:19.810467 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-7j9ch" podStartSLOduration=3.915910428 podStartE2EDuration="5.810453647s" podCreationTimestamp="2026-04-16 14:01:14 +0000 UTC" 
firstStartedPulling="2026-04-16 14:01:15.87904774 +0000 UTC m=+116.072863655" lastFinishedPulling="2026-04-16 14:01:17.773590961 +0000 UTC m=+117.967406874" observedRunningTime="2026-04-16 14:01:19.809029913 +0000 UTC m=+120.002845847" watchObservedRunningTime="2026-04-16 14:01:19.810453647 +0000 UTC m=+120.004269581" Apr 16 14:01:21.987109 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:21.987079 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b96c7f5f9-wlwtx"] Apr 16 14:01:30.199647 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:30.199588 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 14:01:30.201836 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:30.201808 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17-metrics-certs\") pod \"network-metrics-daemon-7pb8h\" (UID: \"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17\") " pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 14:01:30.267823 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:30.267799 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-p5bbb\"" Apr 16 14:01:30.275480 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:30.275462 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-7pb8h" Apr 16 14:01:30.395568 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:30.395519 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-7pb8h"] Apr 16 14:01:30.399062 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:01:30.399033 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e697f2d_12ee_44d4_bd4b_71aa2cf8ee17.slice/crio-359018266d490dbd0b20fae35b9a5f140ce5639cdca70d8e700a294fedc726d5 WatchSource:0}: Error finding container 359018266d490dbd0b20fae35b9a5f140ce5639cdca70d8e700a294fedc726d5: Status 404 returned error can't find the container with id 359018266d490dbd0b20fae35b9a5f140ce5639cdca70d8e700a294fedc726d5 Apr 16 14:01:30.815312 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:30.815277 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7pb8h" event={"ID":"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17","Type":"ContainerStarted","Data":"359018266d490dbd0b20fae35b9a5f140ce5639cdca70d8e700a294fedc726d5"} Apr 16 14:01:31.818988 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:31.818906 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7pb8h" event={"ID":"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17","Type":"ContainerStarted","Data":"2966177cf49215035550069996ec8b8a29c265c5d5bc4fccac0b53454cf6c69a"} Apr 16 14:01:31.818988 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:31.818940 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-7pb8h" event={"ID":"1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17","Type":"ContainerStarted","Data":"c87f6a91334bdc583bdcbd131e30623c06e04a45f80d8b1a47240ba966188c98"} Apr 16 
14:01:31.840816 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:31.840769 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-7pb8h" podStartSLOduration=130.707202052 podStartE2EDuration="2m11.840756885s" podCreationTimestamp="2026-04-16 13:59:20 +0000 UTC" firstStartedPulling="2026-04-16 14:01:30.400759044 +0000 UTC m=+130.594574958" lastFinishedPulling="2026-04-16 14:01:31.534313878 +0000 UTC m=+131.728129791" observedRunningTime="2026-04-16 14:01:31.839301271 +0000 UTC m=+132.033117206" watchObservedRunningTime="2026-04-16 14:01:31.840756885 +0000 UTC m=+132.034572861" Apr 16 14:01:35.315588 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:35.315545 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" podUID="09f5aee5-4aee-460f-a0f2-2434d35686af" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:01:45.315254 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:45.315212 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" podUID="09f5aee5-4aee-460f-a0f2-2434d35686af" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:01:47.005430 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.005363 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" podUID="8d0a3d85-9abd-4994-ba30-3173587f16f3" containerName="registry" containerID="cri-o://d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8" gracePeriod=30 Apr 16 14:01:47.236545 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.236522 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:01:47.315236 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315160 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8bjdk\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-kube-api-access-8bjdk\") pod \"8d0a3d85-9abd-4994-ba30-3173587f16f3\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " Apr 16 14:01:47.315236 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315201 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-certificates\") pod \"8d0a3d85-9abd-4994-ba30-3173587f16f3\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " Apr 16 14:01:47.315236 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315228 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") pod \"8d0a3d85-9abd-4994-ba30-3173587f16f3\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " Apr 16 14:01:47.315567 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315256 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d0a3d85-9abd-4994-ba30-3173587f16f3-ca-trust-extracted\") pod \"8d0a3d85-9abd-4994-ba30-3173587f16f3\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " Apr 16 14:01:47.315567 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315339 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-installation-pull-secrets\") pod \"8d0a3d85-9abd-4994-ba30-3173587f16f3\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " Apr 16 14:01:47.315567 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315366 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-bound-sa-token\") pod \"8d0a3d85-9abd-4994-ba30-3173587f16f3\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " Apr 16 14:01:47.315567 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315392 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-image-registry-private-configuration\") pod \"8d0a3d85-9abd-4994-ba30-3173587f16f3\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " Apr 16 14:01:47.315567 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315417 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-trusted-ca\") pod \"8d0a3d85-9abd-4994-ba30-3173587f16f3\" (UID: \"8d0a3d85-9abd-4994-ba30-3173587f16f3\") " Apr 16 14:01:47.315810 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315636 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "8d0a3d85-9abd-4994-ba30-3173587f16f3" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:47.315974 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.315949 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "8d0a3d85-9abd-4994-ba30-3173587f16f3" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:01:47.317865 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.317834 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "8d0a3d85-9abd-4994-ba30-3173587f16f3" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:47.317865 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.317844 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "8d0a3d85-9abd-4994-ba30-3173587f16f3" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:47.318011 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.317988 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-kube-api-access-8bjdk" (OuterVolumeSpecName: "kube-api-access-8bjdk") pod "8d0a3d85-9abd-4994-ba30-3173587f16f3" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3"). InnerVolumeSpecName "kube-api-access-8bjdk". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:47.318069 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.318025 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "8d0a3d85-9abd-4994-ba30-3173587f16f3" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:01:47.318360 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.318341 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "8d0a3d85-9abd-4994-ba30-3173587f16f3" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:01:47.323910 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.323889 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8d0a3d85-9abd-4994-ba30-3173587f16f3-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "8d0a3d85-9abd-4994-ba30-3173587f16f3" (UID: "8d0a3d85-9abd-4994-ba30-3173587f16f3"). InnerVolumeSpecName "ca-trust-extracted". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:01:47.416091 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.416063 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8bjdk\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-kube-api-access-8bjdk\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:01:47.416091 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.416087 2569 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-certificates\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:01:47.416250 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.416098 2569 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-registry-tls\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:01:47.416250 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.416106 2569 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/8d0a3d85-9abd-4994-ba30-3173587f16f3-ca-trust-extracted\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:01:47.416250 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.416115 2569 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-installation-pull-secrets\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:01:47.416250 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.416124 2569 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8d0a3d85-9abd-4994-ba30-3173587f16f3-bound-sa-token\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:01:47.416250 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.416133 2569 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/8d0a3d85-9abd-4994-ba30-3173587f16f3-image-registry-private-configuration\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:01:47.416250 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.416141 2569 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8d0a3d85-9abd-4994-ba30-3173587f16f3-trusted-ca\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:01:47.859542 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.859508 2569 generic.go:358] "Generic (PLEG): container finished" podID="8d0a3d85-9abd-4994-ba30-3173587f16f3" containerID="d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8" exitCode=0 Apr 16 14:01:47.859714 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.859570 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" Apr 16 14:01:47.859714 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.859599 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" event={"ID":"8d0a3d85-9abd-4994-ba30-3173587f16f3","Type":"ContainerDied","Data":"d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8"} Apr 16 14:01:47.859714 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.859641 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-b96c7f5f9-wlwtx" event={"ID":"8d0a3d85-9abd-4994-ba30-3173587f16f3","Type":"ContainerDied","Data":"27cead803187e8d835758763803a22a4efe4ee55bfc836b6c1a6cad827c0a873"} Apr 16 14:01:47.859714 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.859656 2569 scope.go:117] "RemoveContainer" containerID="d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8" Apr 16 14:01:47.867366 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.867345 2569 scope.go:117] "RemoveContainer" containerID="d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8" Apr 16 14:01:47.867843 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:01:47.867819 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8\": container with ID starting with d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8 not found: ID does not exist" containerID="d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8" Apr 16 14:01:47.867924 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.867849 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8"} err="failed to get container status \"d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8\": rpc error: code = NotFound desc = could not find container \"d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8\": container with ID starting with d5f34dab1c2027c4099b40fde920cdb6007e6ae51cc14e5a07fe1aa515bdf6d8 not found: ID does not exist" Apr 16 14:01:47.881328 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.881304 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-b96c7f5f9-wlwtx"] Apr 16 14:01:47.882651 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:47.882620 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-b96c7f5f9-wlwtx"] Apr 16 14:01:48.346323 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:48.346260 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8d0a3d85-9abd-4994-ba30-3173587f16f3" path="/var/lib/kubelet/pods/8d0a3d85-9abd-4994-ba30-3173587f16f3/volumes" Apr 16 14:01:55.315060 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:55.315021 2569 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" podUID="09f5aee5-4aee-460f-a0f2-2434d35686af" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 16 14:01:55.315636 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:55.315103 2569 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" Apr 16 14:01:55.315781 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:55.315758 2569 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"1151d60e8028f1da1ef7ab89609ad0d77a54d282e92deef1ce5f07b7a1df99bd"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 16 14:01:55.315850 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:55.315810 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" podUID="09f5aee5-4aee-460f-a0f2-2434d35686af" containerName="service-proxy" containerID="cri-o://1151d60e8028f1da1ef7ab89609ad0d77a54d282e92deef1ce5f07b7a1df99bd" gracePeriod=30 Apr 16 14:01:55.881370 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:55.881330 2569 generic.go:358] "Generic (PLEG): container finished" podID="09f5aee5-4aee-460f-a0f2-2434d35686af" containerID="1151d60e8028f1da1ef7ab89609ad0d77a54d282e92deef1ce5f07b7a1df99bd" exitCode=2 Apr 16 14:01:55.881370 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:55.881370 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" event={"ID":"09f5aee5-4aee-460f-a0f2-2434d35686af","Type":"ContainerDied","Data":"1151d60e8028f1da1ef7ab89609ad0d77a54d282e92deef1ce5f07b7a1df99bd"} Apr 16 14:01:55.881617 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:01:55.881404 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-79bdb79488-bjfvc" event={"ID":"09f5aee5-4aee-460f-a0f2-2434d35686af","Type":"ContainerStarted","Data":"5ec1be43fda541a8fac92061db0e66ae704fa0fe57b8932bcd04b80cdd3535bd"} Apr 16 14:04:20.236538 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:04:20.236509 2569 kubelet.go:1628] "Image garbage collection succeeded" Apr 16 14:07:54.423386 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.423348 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj"] Apr 16 14:07:54.423949 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.423641 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8d0a3d85-9abd-4994-ba30-3173587f16f3" containerName="registry" Apr 16 14:07:54.423949 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.423657 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="8d0a3d85-9abd-4994-ba30-3173587f16f3" containerName="registry" Apr 16 14:07:54.423949 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.423730 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="8d0a3d85-9abd-4994-ba30-3173587f16f3" containerName="registry" Apr 16 14:07:54.426454 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.426434 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:54.429295 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.429258 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 16 14:07:54.429390 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.429302 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-m5k4v\"" Apr 16 14:07:54.429390 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.429313 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 16 14:07:54.429390 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.429311 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 16 14:07:54.430401 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.430385 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-metrics-apiserver-certs\"" Apr 16 14:07:54.430495 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.430420 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 16 14:07:54.433695 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.433676 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj"] Apr 16 14:07:54.510795 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.510765 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/522d0505-6bef-4c80-83ab-a4b43cf0029c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:54.510937 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.510802 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:54.510937 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.510830 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2rnjd\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-kube-api-access-2rnjd\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:54.611374 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.611344 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/522d0505-6bef-4c80-83ab-a4b43cf0029c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:54.611508 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.611384 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: 
\"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:54.611508 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:54.611475 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 14:07:54.611508 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:54.611489 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 14:07:54.611508 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:54.611507 2569 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 14:07:54.611649 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.611514 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2rnjd\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-kube-api-access-2rnjd\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:54.611649 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:54.611531 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 14:07:54.611649 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:54.611597 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates podName:522d0505-6bef-4c80-83ab-a4b43cf0029c nodeName:}" failed. No retries permitted until 2026-04-16 14:07:55.111577483 +0000 UTC m=+515.305393397 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates") pod "keda-metrics-apiserver-7c9f485588-kzkcj" (UID: "522d0505-6bef-4c80-83ab-a4b43cf0029c") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 14:07:54.612257 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.612240 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"temp-vol\" (UniqueName: \"kubernetes.io/empty-dir/522d0505-6bef-4c80-83ab-a4b43cf0029c-temp-vol\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:54.620666 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:54.620643 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2rnjd\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-kube-api-access-2rnjd\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:55.117057 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:55.117019 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:55.117244 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:55.117172 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 14:07:55.117244 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:55.117192 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 14:07:55.117244 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:55.117211 2569 projected.go:264] Couldn't get secret openshift-keda/keda-metrics-apiserver-certs: secret "keda-metrics-apiserver-certs" not found Apr 16 14:07:55.117244 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:55.117229 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj: [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 14:07:55.117480 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:55.117341 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates podName:522d0505-6bef-4c80-83ab-a4b43cf0029c nodeName:}" failed. No retries permitted until 2026-04-16 14:07:56.117319877 +0000 UTC m=+516.311135803 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates") pod "keda-metrics-apiserver-7c9f485588-kzkcj" (UID: "522d0505-6bef-4c80-83ab-a4b43cf0029c") : [references non-existent secret key: tls.crt, secret "keda-metrics-apiserver-certs" not found] Apr 16 14:07:56.124866 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:56.124826 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:56.125259 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:56.124976 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 14:07:56.125259 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:56.124994 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 14:07:56.125259 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:56.125013 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj: references non-existent secret key: tls.crt Apr 16 14:07:56.125259 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:56.125064 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates podName:522d0505-6bef-4c80-83ab-a4b43cf0029c nodeName:}" failed. No retries permitted until 2026-04-16 14:07:58.12505065 +0000 UTC m=+518.318866564 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates") pod "keda-metrics-apiserver-7c9f485588-kzkcj" (UID: "522d0505-6bef-4c80-83ab-a4b43cf0029c") : references non-existent secret key: tls.crt Apr 16 14:07:58.139619 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:07:58.139581 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:07:58.140002 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:58.139719 2569 secret.go:281] references non-existent secret key: tls.crt Apr 16 14:07:58.140002 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:58.139737 2569 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: tls.crt Apr 16 14:07:58.140002 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:58.139755 2569 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj: references non-existent secret key: tls.crt Apr 16 14:07:58.140002 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:07:58.139819 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates podName:522d0505-6bef-4c80-83ab-a4b43cf0029c nodeName:}" failed. No retries permitted until 2026-04-16 14:08:02.139802779 +0000 UTC m=+522.333618693 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates") pod "keda-metrics-apiserver-7c9f485588-kzkcj" (UID: "522d0505-6bef-4c80-83ab-a4b43cf0029c") : references non-existent secret key: tls.crt Apr 16 14:08:02.169981 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:02.169945 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:08:02.172474 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:02.172456 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/522d0505-6bef-4c80-83ab-a4b43cf0029c-certificates\") pod \"keda-metrics-apiserver-7c9f485588-kzkcj\" (UID: \"522d0505-6bef-4c80-83ab-a4b43cf0029c\") " pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:08:02.236512 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:02.236473 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:08:02.350768 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:02.350737 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj"] Apr 16 14:08:02.354711 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:08:02.354681 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod522d0505_6bef_4c80_83ab_a4b43cf0029c.slice/crio-34dc6d1371c772ab9b03d7be063f34b33b66f2dba234c9ff3d9b8fdc247b122b WatchSource:0}: Error finding container 34dc6d1371c772ab9b03d7be063f34b33b66f2dba234c9ff3d9b8fdc247b122b: Status 404 returned error can't find the container with id 34dc6d1371c772ab9b03d7be063f34b33b66f2dba234c9ff3d9b8fdc247b122b Apr 16 14:08:02.355895 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:02.355881 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:08:02.773137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:02.773105 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" event={"ID":"522d0505-6bef-4c80-83ab-a4b43cf0029c","Type":"ContainerStarted","Data":"34dc6d1371c772ab9b03d7be063f34b33b66f2dba234c9ff3d9b8fdc247b122b"} Apr 16 14:08:05.783123 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:05.783096 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" event={"ID":"522d0505-6bef-4c80-83ab-a4b43cf0029c","Type":"ContainerStarted","Data":"38e5a3b3a9e58c7063e6347290dc658ca9a33cb9867c1e58f4080151bd14445f"} Apr 16 14:08:05.783465 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:05.783218 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:08:05.799870 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:05.799821 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" podStartSLOduration=8.444790989 podStartE2EDuration="11.799805626s" podCreationTimestamp="2026-04-16 14:07:54 +0000 
UTC" firstStartedPulling="2026-04-16 14:08:02.356002033 +0000 UTC m=+522.549817945" lastFinishedPulling="2026-04-16 14:08:05.711016663 +0000 UTC m=+525.904832582" observedRunningTime="2026-04-16 14:08:05.798536552 +0000 UTC m=+525.992352675" watchObservedRunningTime="2026-04-16 14:08:05.799805626 +0000 UTC m=+525.993621563" Apr 16 14:08:16.791943 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:16.791865 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-metrics-apiserver-7c9f485588-kzkcj" Apr 16 14:08:59.305836 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.305799 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl"] Apr 16 14:08:59.308837 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.308821 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" Apr 16 14:08:59.311668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.311647 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\"" Apr 16 14:08:59.311791 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.311763 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:08:59.311791 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.311763 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-jc6md\"" Apr 16 14:08:59.321313 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.321292 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl"] Apr 16 14:08:59.342284 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.342234 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26b371e4-3de4-4f66-9399-11546a06165d-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-68fxl\" (UID: \"26b371e4-3de4-4f66-9399-11546a06165d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" Apr 16 14:08:59.342374 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.342336 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-g9zdz\" (UniqueName: \"kubernetes.io/projected/26b371e4-3de4-4f66-9399-11546a06165d-kube-api-access-g9zdz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-68fxl\" (UID: \"26b371e4-3de4-4f66-9399-11546a06165d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" Apr 16 14:08:59.443287 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.443241 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26b371e4-3de4-4f66-9399-11546a06165d-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-68fxl\" (UID: \"26b371e4-3de4-4f66-9399-11546a06165d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" Apr 16 14:08:59.443429 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.443310 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-g9zdz\" (UniqueName: 
\"kubernetes.io/projected/26b371e4-3de4-4f66-9399-11546a06165d-kube-api-access-g9zdz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-68fxl\" (UID: \"26b371e4-3de4-4f66-9399-11546a06165d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" Apr 16 14:08:59.443612 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.443594 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/26b371e4-3de4-4f66-9399-11546a06165d-tmp\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-68fxl\" (UID: \"26b371e4-3de4-4f66-9399-11546a06165d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" Apr 16 14:08:59.452091 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.452064 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-g9zdz\" (UniqueName: \"kubernetes.io/projected/26b371e4-3de4-4f66-9399-11546a06165d-kube-api-access-g9zdz\") pod \"cert-manager-operator-controller-manager-7ccfb878b5-68fxl\" (UID: \"26b371e4-3de4-4f66-9399-11546a06165d\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" Apr 16 14:08:59.618030 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.617935 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" Apr 16 14:08:59.741838 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.741809 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl"] Apr 16 14:08:59.745073 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:08:59.745048 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26b371e4_3de4_4f66_9399_11546a06165d.slice/crio-05c1d755c18154eb0ec5ac018d6dfa0390bf2a16dc5186be7e7317c42c7a7446 WatchSource:0}: Error finding container 05c1d755c18154eb0ec5ac018d6dfa0390bf2a16dc5186be7e7317c42c7a7446: Status 404 returned error can't find the container with id 05c1d755c18154eb0ec5ac018d6dfa0390bf2a16dc5186be7e7317c42c7a7446 Apr 16 14:08:59.915584 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:08:59.915493 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" event={"ID":"26b371e4-3de4-4f66-9399-11546a06165d","Type":"ContainerStarted","Data":"05c1d755c18154eb0ec5ac018d6dfa0390bf2a16dc5186be7e7317c42c7a7446"} Apr 16 14:09:06.935527 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:06.935443 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" event={"ID":"26b371e4-3de4-4f66-9399-11546a06165d","Type":"ContainerStarted","Data":"dfd84911aad839d53336bb2b4951d31697d8158c56462579d101971f39c08776"} Apr 16 14:09:06.961238 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:06.961187 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-7ccfb878b5-68fxl" podStartSLOduration=1.096912267 podStartE2EDuration="7.961172936s" podCreationTimestamp="2026-04-16 14:08:59 +0000 UTC" firstStartedPulling="2026-04-16 14:08:59.747599611 +0000 UTC m=+579.941415523" lastFinishedPulling="2026-04-16 14:09:06.611860276 +0000 UTC m=+586.805676192" observedRunningTime="2026-04-16 
14:09:06.959841628 +0000 UTC m=+587.153657563" watchObservedRunningTime="2026-04-16 14:09:06.961172936 +0000 UTC m=+587.154988874" Apr 16 14:09:13.284151 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.284118 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-l6r7z"] Apr 16 14:09:13.287171 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.287157 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" Apr 16 14:09:13.290673 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.290648 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 16 14:09:13.292055 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.292036 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 16 14:09:13.292164 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.292035 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-2qrj2\"" Apr 16 14:09:13.304042 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.304024 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-l6r7z"] Apr 16 14:09:13.338744 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.338719 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-b7vlr\" (UniqueName: \"kubernetes.io/projected/92cd7aff-30e9-47b5-b243-c40a9746add4-kube-api-access-b7vlr\") pod \"cert-manager-cainjector-8966b78d4-l6r7z\" (UID: \"92cd7aff-30e9-47b5-b243-c40a9746add4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" Apr 16 14:09:13.338842 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.338772 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92cd7aff-30e9-47b5-b243-c40a9746add4-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-l6r7z\" (UID: \"92cd7aff-30e9-47b5-b243-c40a9746add4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" Apr 16 14:09:13.439800 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.439770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92cd7aff-30e9-47b5-b243-c40a9746add4-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-l6r7z\" (UID: \"92cd7aff-30e9-47b5-b243-c40a9746add4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" Apr 16 14:09:13.439908 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.439815 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b7vlr\" (UniqueName: \"kubernetes.io/projected/92cd7aff-30e9-47b5-b243-c40a9746add4-kube-api-access-b7vlr\") pod \"cert-manager-cainjector-8966b78d4-l6r7z\" (UID: \"92cd7aff-30e9-47b5-b243-c40a9746add4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" Apr 16 14:09:13.451131 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.451104 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/92cd7aff-30e9-47b5-b243-c40a9746add4-bound-sa-token\") pod \"cert-manager-cainjector-8966b78d4-l6r7z\" (UID: \"92cd7aff-30e9-47b5-b243-c40a9746add4\") " 
pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" Apr 16 14:09:13.451348 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.451328 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b7vlr\" (UniqueName: \"kubernetes.io/projected/92cd7aff-30e9-47b5-b243-c40a9746add4-kube-api-access-b7vlr\") pod \"cert-manager-cainjector-8966b78d4-l6r7z\" (UID: \"92cd7aff-30e9-47b5-b243-c40a9746add4\") " pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" Apr 16 14:09:13.595953 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.595869 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" Apr 16 14:09:13.710445 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.710413 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-8966b78d4-l6r7z"] Apr 16 14:09:13.713758 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:09:13.713727 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod92cd7aff_30e9_47b5_b243_c40a9746add4.slice/crio-e59963470572a060332801de7a136f43eab5193d99b9bc208e9cad794e20a792 WatchSource:0}: Error finding container e59963470572a060332801de7a136f43eab5193d99b9bc208e9cad794e20a792: Status 404 returned error can't find the container with id e59963470572a060332801de7a136f43eab5193d99b9bc208e9cad794e20a792 Apr 16 14:09:13.952558 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:13.952480 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" event={"ID":"92cd7aff-30e9-47b5-b243-c40a9746add4","Type":"ContainerStarted","Data":"e59963470572a060332801de7a136f43eab5193d99b9bc208e9cad794e20a792"} Apr 16 14:09:17.965073 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:17.964990 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" event={"ID":"92cd7aff-30e9-47b5-b243-c40a9746add4","Type":"ContainerStarted","Data":"bc6b1f490313eda77196be225a34214b848c2158ac37951c0d885fc7938440d4"} Apr 16 14:09:17.981502 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:17.981454 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-8966b78d4-l6r7z" podStartSLOduration=1.027967897 podStartE2EDuration="4.98144173s" podCreationTimestamp="2026-04-16 14:09:13 +0000 UTC" firstStartedPulling="2026-04-16 14:09:13.715430487 +0000 UTC m=+593.909246403" lastFinishedPulling="2026-04-16 14:09:17.668904321 +0000 UTC m=+597.862720236" observedRunningTime="2026-04-16 14:09:17.980314442 +0000 UTC m=+598.174130376" watchObservedRunningTime="2026-04-16 14:09:17.98144173 +0000 UTC m=+598.175257664" Apr 16 14:09:21.862985 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.862950 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2"] Apr 16 14:09:21.865979 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.865963 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" Apr 16 14:09:21.870775 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.870741 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 16 14:09:21.872035 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.871935 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 16 14:09:21.872120 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.871936 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-plskr\"" Apr 16 14:09:21.877679 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.877658 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2"] Apr 16 14:09:21.897112 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.897088 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kcqz\" (UniqueName: \"kubernetes.io/projected/296ccc38-fbad-4869-915f-9dc8608b6e24-kube-api-access-8kcqz\") pod \"openshift-lws-operator-bfc7f696d-rs8j2\" (UID: \"296ccc38-fbad-4869-915f-9dc8608b6e24\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" Apr 16 14:09:21.897219 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.897122 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/296ccc38-fbad-4869-915f-9dc8608b6e24-tmp\") pod \"openshift-lws-operator-bfc7f696d-rs8j2\" (UID: \"296ccc38-fbad-4869-915f-9dc8608b6e24\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" Apr 16 14:09:21.997632 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.997605 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kcqz\" (UniqueName: \"kubernetes.io/projected/296ccc38-fbad-4869-915f-9dc8608b6e24-kube-api-access-8kcqz\") pod \"openshift-lws-operator-bfc7f696d-rs8j2\" (UID: \"296ccc38-fbad-4869-915f-9dc8608b6e24\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" Apr 16 14:09:21.997764 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.997640 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/296ccc38-fbad-4869-915f-9dc8608b6e24-tmp\") pod \"openshift-lws-operator-bfc7f696d-rs8j2\" (UID: \"296ccc38-fbad-4869-915f-9dc8608b6e24\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" Apr 16 14:09:21.998010 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:21.997994 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/296ccc38-fbad-4869-915f-9dc8608b6e24-tmp\") pod \"openshift-lws-operator-bfc7f696d-rs8j2\" (UID: \"296ccc38-fbad-4869-915f-9dc8608b6e24\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" Apr 16 14:09:22.010212 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:22.010182 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kcqz\" (UniqueName: \"kubernetes.io/projected/296ccc38-fbad-4869-915f-9dc8608b6e24-kube-api-access-8kcqz\") pod \"openshift-lws-operator-bfc7f696d-rs8j2\" (UID: \"296ccc38-fbad-4869-915f-9dc8608b6e24\") " 
pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" Apr 16 14:09:22.175100 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:22.175021 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" Apr 16 14:09:22.291124 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:22.291091 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2"] Apr 16 14:09:22.294140 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:09:22.294106 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod296ccc38_fbad_4869_915f_9dc8608b6e24.slice/crio-f7de55b4fc891fbb77b0d6733e2b908a16f80bb387660bd90c57f6d1919eed06 WatchSource:0}: Error finding container f7de55b4fc891fbb77b0d6733e2b908a16f80bb387660bd90c57f6d1919eed06: Status 404 returned error can't find the container with id f7de55b4fc891fbb77b0d6733e2b908a16f80bb387660bd90c57f6d1919eed06 Apr 16 14:09:22.979703 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:22.979667 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" event={"ID":"296ccc38-fbad-4869-915f-9dc8608b6e24","Type":"ContainerStarted","Data":"f7de55b4fc891fbb77b0d6733e2b908a16f80bb387660bd90c57f6d1919eed06"} Apr 16 14:09:25.989454 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:25.989417 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" event={"ID":"296ccc38-fbad-4869-915f-9dc8608b6e24","Type":"ContainerStarted","Data":"1d1eeb0aeee987f30d385ee4d1e2dfc64cf4ddba1b595d05c140eee7f148277e"} Apr 16 14:09:26.005736 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:26.005685 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-rs8j2" podStartSLOduration=2.359175835 podStartE2EDuration="5.005669659s" podCreationTimestamp="2026-04-16 14:09:21 +0000 UTC" firstStartedPulling="2026-04-16 14:09:22.296217938 +0000 UTC m=+602.490033854" lastFinishedPulling="2026-04-16 14:09:24.942711765 +0000 UTC m=+605.136527678" observedRunningTime="2026-04-16 14:09:26.004862202 +0000 UTC m=+606.198678162" watchObservedRunningTime="2026-04-16 14:09:26.005669659 +0000 UTC m=+606.199485597" Apr 16 14:09:55.943006 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:55.942927 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cf88r"] Apr 16 14:09:55.946300 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:55.946259 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:09:55.949676 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:55.949656 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"kube-root-ca.crt\"" Apr 16 14:09:55.949822 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:55.949770 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-operators\"/\"servicemesh-operator3-dockercfg-t7x5m\"" Apr 16 14:09:55.950308 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:55.950293 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-operators\"/\"openshift-service-ca.crt\"" Apr 16 14:09:55.977598 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:55.977570 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cf88r"] Apr 16 14:09:56.052363 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:56.052331 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fcw2j\" (UniqueName: \"kubernetes.io/projected/d456e4d0-0a41-44a1-964e-146197b11df8-kube-api-access-fcw2j\") pod \"servicemesh-operator3-55f49c5f94-cf88r\" (UID: \"d456e4d0-0a41-44a1-964e-146197b11df8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:09:56.052501 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:56.052391 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d456e4d0-0a41-44a1-964e-146197b11df8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cf88r\" (UID: \"d456e4d0-0a41-44a1-964e-146197b11df8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:09:56.153728 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:56.153699 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d456e4d0-0a41-44a1-964e-146197b11df8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cf88r\" (UID: \"d456e4d0-0a41-44a1-964e-146197b11df8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:09:56.153866 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:56.153752 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fcw2j\" (UniqueName: \"kubernetes.io/projected/d456e4d0-0a41-44a1-964e-146197b11df8-kube-api-access-fcw2j\") pod \"servicemesh-operator3-55f49c5f94-cf88r\" (UID: \"d456e4d0-0a41-44a1-964e-146197b11df8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:09:56.156118 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:56.156099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"operator-config\" (UniqueName: \"kubernetes.io/downward-api/d456e4d0-0a41-44a1-964e-146197b11df8-operator-config\") pod \"servicemesh-operator3-55f49c5f94-cf88r\" (UID: \"d456e4d0-0a41-44a1-964e-146197b11df8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:09:56.166183 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:56.166162 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fcw2j\" (UniqueName: \"kubernetes.io/projected/d456e4d0-0a41-44a1-964e-146197b11df8-kube-api-access-fcw2j\") pod \"servicemesh-operator3-55f49c5f94-cf88r\" (UID: 
\"d456e4d0-0a41-44a1-964e-146197b11df8\") " pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:09:56.255163 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:56.255129 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:09:56.379293 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:56.379247 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-operators/servicemesh-operator3-55f49c5f94-cf88r"] Apr 16 14:09:56.383294 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:09:56.383256 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd456e4d0_0a41_44a1_964e_146197b11df8.slice/crio-141672fea6673011512a910138eb2f4e347afdfda6a11e034c0a1be7c46fe485 WatchSource:0}: Error finding container 141672fea6673011512a910138eb2f4e347afdfda6a11e034c0a1be7c46fe485: Status 404 returned error can't find the container with id 141672fea6673011512a910138eb2f4e347afdfda6a11e034c0a1be7c46fe485 Apr 16 14:09:57.078865 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:09:57.078820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" event={"ID":"d456e4d0-0a41-44a1-964e-146197b11df8","Type":"ContainerStarted","Data":"141672fea6673011512a910138eb2f4e347afdfda6a11e034c0a1be7c46fe485"} Apr 16 14:10:00.093560 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:00.093523 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" event={"ID":"d456e4d0-0a41-44a1-964e-146197b11df8","Type":"ContainerStarted","Data":"fc03fbfeea8bd92e29d50bcb574e59f305e14af445815722e8f2dad6b65a78b1"} Apr 16 14:10:00.093925 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:00.093588 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:10:00.125420 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:00.125374 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" podStartSLOduration=2.241135067 podStartE2EDuration="5.125360268s" podCreationTimestamp="2026-04-16 14:09:55 +0000 UTC" firstStartedPulling="2026-04-16 14:09:56.385696546 +0000 UTC m=+636.579512459" lastFinishedPulling="2026-04-16 14:09:59.269921736 +0000 UTC m=+639.463737660" observedRunningTime="2026-04-16 14:10:00.122755818 +0000 UTC m=+640.316571753" watchObservedRunningTime="2026-04-16 14:10:00.125360268 +0000 UTC m=+640.319176200" Apr 16 14:10:11.099609 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:11.099579 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-operators/servicemesh-operator3-55f49c5f94-cf88r" Apr 16 14:10:24.856344 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.856310 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26"] Apr 16 14:10:24.859481 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.859460 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:24.862667 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.862590 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"cacerts\"" Apr 16 14:10:24.862667 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.862621 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 16 14:10:24.862667 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.862649 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-openshift-gateway-dockercfg-rtwpl\"" Apr 16 14:10:24.862880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.862667 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istiod-tls\"" Apr 16 14:10:24.862880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.862650 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"istio-kubeconfig\"" Apr 16 14:10:24.862880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.862654 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 16 14:10:24.862880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.862650 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"istio-ca-root-cert\"" Apr 16 14:10:24.871573 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.871551 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26"] Apr 16 14:10:24.957263 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.957228 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:24.957263 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.957308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/487feac5-bfbb-4728-957d-37c930ffbabb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:24.957531 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.957344 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:24.957531 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.957362 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-td2qb\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-kube-api-access-td2qb\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:24.957531 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.957429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:24.957531 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.957446 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:24.957531 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:24.957470 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.058743 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.058694 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.058948 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.058750 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.058948 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.058797 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.058948 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.058819 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/487feac5-bfbb-4728-957d-37c930ffbabb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.058948 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.058860 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: 
\"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.058948 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.058886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-td2qb\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-kube-api-access-td2qb\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.058948 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.058929 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.059677 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.059646 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.061643 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.061590 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-cacerts\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.061819 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.061732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/487feac5-bfbb-4728-957d-37c930ffbabb-local-certs\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.061819 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.061747 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-kubeconfig\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.061934 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.061917 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.066716 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.066687 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-istio-token\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.067048 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.067031 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-td2qb\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-kube-api-access-td2qb\") pod \"istiod-openshift-gateway-7cd77c7ffd-4cx26\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.168544 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.168467 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:25.296816 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:25.296783 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26"] Apr 16 14:10:25.299836 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:10:25.299806 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod487feac5_bfbb_4728_957d_37c930ffbabb.slice/crio-4c84708d0b82c692abb20b0dbcd79e071e85b22c25683f1f272dabea51a30a38 WatchSource:0}: Error finding container 4c84708d0b82c692abb20b0dbcd79e071e85b22c25683f1f272dabea51a30a38: Status 404 returned error can't find the container with id 4c84708d0b82c692abb20b0dbcd79e071e85b22c25683f1f272dabea51a30a38 Apr 16 14:10:26.171193 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:26.171160 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" event={"ID":"487feac5-bfbb-4728-957d-37c930ffbabb","Type":"ContainerStarted","Data":"4c84708d0b82c692abb20b0dbcd79e071e85b22c25683f1f272dabea51a30a38"} Apr 16 14:10:28.641407 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:28.641370 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 14:10:28.641674 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:28.641440 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 14:10:29.182972 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:29.182941 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" event={"ID":"487feac5-bfbb-4728-957d-37c930ffbabb","Type":"ContainerStarted","Data":"edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d"} Apr 16 14:10:29.183169 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:29.183048 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:29.184639 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:29.184611 2569 patch_prober.go:28] interesting pod/istiod-openshift-gateway-7cd77c7ffd-4cx26 container/discovery namespace/openshift-ingress: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body= Apr 16 14:10:29.184744 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:29.184673 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" podUID="487feac5-bfbb-4728-957d-37c930ffbabb" containerName="discovery" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 16 14:10:29.203183 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:29.203139 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" podStartSLOduration=1.863883868 podStartE2EDuration="5.203127755s" podCreationTimestamp="2026-04-16 14:10:24 +0000 UTC" firstStartedPulling="2026-04-16 14:10:25.301898294 +0000 UTC m=+665.495714207" lastFinishedPulling="2026-04-16 14:10:28.641142174 +0000 UTC m=+668.834958094" observedRunningTime="2026-04-16 14:10:29.201278321 +0000 UTC m=+669.395094247" watchObservedRunningTime="2026-04-16 14:10:29.203127755 +0000 UTC m=+669.396943689" Apr 16 14:10:30.186519 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:30.186486 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:10:51.050003 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.049969 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-c228b"] Apr 16 14:10:51.051999 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.051982 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-c228b" Apr 16 14:10:51.054980 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.054958 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 16 14:10:51.055092 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.055074 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 16 14:10:51.056707 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.056685 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-operator-dockercfg-xcdqp\"" Apr 16 14:10:51.062735 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.062713 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-c228b"] Apr 16 14:10:51.165033 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.164990 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fqcmq\" (UniqueName: \"kubernetes.io/projected/9b559eee-5fae-42f4-b45b-6df0c23fac72-kube-api-access-fqcmq\") pod \"authorino-operator-7587b89b76-c228b\" (UID: \"9b559eee-5fae-42f4-b45b-6df0c23fac72\") " pod="kuadrant-system/authorino-operator-7587b89b76-c228b" Apr 16 14:10:51.265463 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.265429 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fqcmq\" (UniqueName: \"kubernetes.io/projected/9b559eee-5fae-42f4-b45b-6df0c23fac72-kube-api-access-fqcmq\") pod \"authorino-operator-7587b89b76-c228b\" (UID: \"9b559eee-5fae-42f4-b45b-6df0c23fac72\") " pod="kuadrant-system/authorino-operator-7587b89b76-c228b" Apr 16 14:10:51.279233 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.279202 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fqcmq\" (UniqueName: \"kubernetes.io/projected/9b559eee-5fae-42f4-b45b-6df0c23fac72-kube-api-access-fqcmq\") pod \"authorino-operator-7587b89b76-c228b\" (UID: \"9b559eee-5fae-42f4-b45b-6df0c23fac72\") " pod="kuadrant-system/authorino-operator-7587b89b76-c228b" Apr 16 
14:10:51.362446 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.362374 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-operator-7587b89b76-c228b" Apr 16 14:10:51.479510 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:51.479480 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-operator-7587b89b76-c228b"] Apr 16 14:10:51.482044 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:10:51.482015 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9b559eee_5fae_42f4_b45b_6df0c23fac72.slice/crio-90c03ef3669d11d7de27856f528402d4df0290aa0ecad9aa131697df5c059cad WatchSource:0}: Error finding container 90c03ef3669d11d7de27856f528402d4df0290aa0ecad9aa131697df5c059cad: Status 404 returned error can't find the container with id 90c03ef3669d11d7de27856f528402d4df0290aa0ecad9aa131697df5c059cad Apr 16 14:10:52.249966 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:52.249926 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-c228b" event={"ID":"9b559eee-5fae-42f4-b45b-6df0c23fac72","Type":"ContainerStarted","Data":"90c03ef3669d11d7de27856f528402d4df0290aa0ecad9aa131697df5c059cad"} Apr 16 14:10:56.675671 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.675590 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89"] Apr 16 14:10:56.677671 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.677586 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:10:56.680550 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.680527 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"kuadrant-operator-controller-manager-dockercfg-d8b4c\"" Apr 16 14:10:56.691562 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.691541 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89"] Apr 16 14:10:56.808617 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.808586 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d6b0287d-02fb-422c-9642-d0cabe3c48e4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-4bk89\" (UID: \"d6b0287d-02fb-422c-9642-d0cabe3c48e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:10:56.808774 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.808636 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smk6x\" (UniqueName: \"kubernetes.io/projected/d6b0287d-02fb-422c-9642-d0cabe3c48e4-kube-api-access-smk6x\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-4bk89\" (UID: \"d6b0287d-02fb-422c-9642-d0cabe3c48e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:10:56.909924 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.909894 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d6b0287d-02fb-422c-9642-d0cabe3c48e4-extensions-socket-volume\") pod 
\"kuadrant-operator-controller-manager-6ddf9554fc-4bk89\" (UID: \"d6b0287d-02fb-422c-9642-d0cabe3c48e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:10:56.910073 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.909939 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smk6x\" (UniqueName: \"kubernetes.io/projected/d6b0287d-02fb-422c-9642-d0cabe3c48e4-kube-api-access-smk6x\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-4bk89\" (UID: \"d6b0287d-02fb-422c-9642-d0cabe3c48e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:10:56.910246 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.910227 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"extensions-socket-volume\" (UniqueName: \"kubernetes.io/empty-dir/d6b0287d-02fb-422c-9642-d0cabe3c48e4-extensions-socket-volume\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-4bk89\" (UID: \"d6b0287d-02fb-422c-9642-d0cabe3c48e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:10:56.920675 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.920648 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smk6x\" (UniqueName: \"kubernetes.io/projected/d6b0287d-02fb-422c-9642-d0cabe3c48e4-kube-api-access-smk6x\") pod \"kuadrant-operator-controller-manager-6ddf9554fc-4bk89\" (UID: \"d6b0287d-02fb-422c-9642-d0cabe3c48e4\") " pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:10:56.987314 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:56.987211 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:10:57.110027 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:57.110006 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89"] Apr 16 14:10:57.112544 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:10:57.112514 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6b0287d_02fb_422c_9642_d0cabe3c48e4.slice/crio-2d06eaa649f4dd32f4e53c70cf1817ed2271ceeb7d54ac880a27ba127c1c7cca WatchSource:0}: Error finding container 2d06eaa649f4dd32f4e53c70cf1817ed2271ceeb7d54ac880a27ba127c1c7cca: Status 404 returned error can't find the container with id 2d06eaa649f4dd32f4e53c70cf1817ed2271ceeb7d54ac880a27ba127c1c7cca Apr 16 14:10:57.266299 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:57.266243 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" event={"ID":"d6b0287d-02fb-422c-9642-d0cabe3c48e4","Type":"ContainerStarted","Data":"2d06eaa649f4dd32f4e53c70cf1817ed2271ceeb7d54ac880a27ba127c1c7cca"} Apr 16 14:10:57.267484 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:57.267459 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-operator-7587b89b76-c228b" event={"ID":"9b559eee-5fae-42f4-b45b-6df0c23fac72","Type":"ContainerStarted","Data":"a550c703f32b1e0acf16f172e295d3a82c0453925a061bee6cfc46d1903e20d0"} Apr 16 14:10:57.267640 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:57.267626 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/authorino-operator-7587b89b76-c228b" Apr 16 
14:10:57.290415 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:10:57.290362 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-operator-7587b89b76-c228b" podStartSLOduration=1.513218741 podStartE2EDuration="6.29035157s" podCreationTimestamp="2026-04-16 14:10:51 +0000 UTC" firstStartedPulling="2026-04-16 14:10:51.484152138 +0000 UTC m=+691.677968066" lastFinishedPulling="2026-04-16 14:10:56.261284981 +0000 UTC m=+696.455100895" observedRunningTime="2026-04-16 14:10:57.28825506 +0000 UTC m=+697.482070996" watchObservedRunningTime="2026-04-16 14:10:57.29035157 +0000 UTC m=+697.484167504" Apr 16 14:11:01.282560 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:01.282530 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" event={"ID":"d6b0287d-02fb-422c-9642-d0cabe3c48e4","Type":"ContainerStarted","Data":"278a9da4193743ff4dee168288ad6bd11cbdaddb215e85ec43dbf6463d07ea90"} Apr 16 14:11:01.282945 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:01.282751 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:11:01.305968 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:01.305925 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" podStartSLOduration=1.445746891 podStartE2EDuration="5.305912231s" podCreationTimestamp="2026-04-16 14:10:56 +0000 UTC" firstStartedPulling="2026-04-16 14:10:57.114870728 +0000 UTC m=+697.308686641" lastFinishedPulling="2026-04-16 14:11:00.975036064 +0000 UTC m=+701.168851981" observedRunningTime="2026-04-16 14:11:01.303521526 +0000 UTC m=+701.497337469" watchObservedRunningTime="2026-04-16 14:11:01.305912231 +0000 UTC m=+701.499728166" Apr 16 14:11:08.273789 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:08.273760 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/authorino-operator-7587b89b76-c228b" Apr 16 14:11:12.288084 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:12.288009 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/kuadrant-operator-controller-manager-6ddf9554fc-4bk89" Apr 16 14:11:45.033043 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.033011 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9zhpt"] Apr 16 14:11:45.039177 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.039149 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:11:45.041924 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.041873 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-9bhj5\"" Apr 16 14:11:45.042301 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.042281 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 16 14:11:45.044505 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.044482 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9zhpt"] Apr 16 14:11:45.076845 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.076819 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxnb8\" (UniqueName: \"kubernetes.io/projected/026c0c78-5d4a-404b-bb51-29302f80aa98-kube-api-access-cxnb8\") pod \"limitador-limitador-64c8f475fb-9zhpt\" (UID: \"026c0c78-5d4a-404b-bb51-29302f80aa98\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:11:45.076984 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.076900 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/026c0c78-5d4a-404b-bb51-29302f80aa98-config-file\") pod \"limitador-limitador-64c8f475fb-9zhpt\" (UID: \"026c0c78-5d4a-404b-bb51-29302f80aa98\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:11:45.133241 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.133212 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9zhpt"] Apr 16 14:11:45.177601 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.177568 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/026c0c78-5d4a-404b-bb51-29302f80aa98-config-file\") pod \"limitador-limitador-64c8f475fb-9zhpt\" (UID: \"026c0c78-5d4a-404b-bb51-29302f80aa98\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:11:45.177768 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.177625 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxnb8\" (UniqueName: \"kubernetes.io/projected/026c0c78-5d4a-404b-bb51-29302f80aa98-kube-api-access-cxnb8\") pod \"limitador-limitador-64c8f475fb-9zhpt\" (UID: \"026c0c78-5d4a-404b-bb51-29302f80aa98\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:11:45.178173 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.178154 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/026c0c78-5d4a-404b-bb51-29302f80aa98-config-file\") pod \"limitador-limitador-64c8f475fb-9zhpt\" (UID: \"026c0c78-5d4a-404b-bb51-29302f80aa98\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:11:45.185498 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.185468 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxnb8\" (UniqueName: \"kubernetes.io/projected/026c0c78-5d4a-404b-bb51-29302f80aa98-kube-api-access-cxnb8\") pod \"limitador-limitador-64c8f475fb-9zhpt\" (UID: \"026c0c78-5d4a-404b-bb51-29302f80aa98\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 
14:11:45.350220 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.350130 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:11:45.474247 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:45.474213 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9zhpt"] Apr 16 14:11:45.477390 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:11:45.477356 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod026c0c78_5d4a_404b_bb51_29302f80aa98.slice/crio-49d89afbeb358f6b29ec09e4e2877ac7e591c643fe137a173ad8e627539c8a5d WatchSource:0}: Error finding container 49d89afbeb358f6b29ec09e4e2877ac7e591c643fe137a173ad8e627539c8a5d: Status 404 returned error can't find the container with id 49d89afbeb358f6b29ec09e4e2877ac7e591c643fe137a173ad8e627539c8a5d Apr 16 14:11:46.420916 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:46.420876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" event={"ID":"026c0c78-5d4a-404b-bb51-29302f80aa98","Type":"ContainerStarted","Data":"49d89afbeb358f6b29ec09e4e2877ac7e591c643fe137a173ad8e627539c8a5d"} Apr 16 14:11:49.431834 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:49.431753 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" event={"ID":"026c0c78-5d4a-404b-bb51-29302f80aa98","Type":"ContainerStarted","Data":"581d6528a8275eeccc9e50e7f63b84303ba20fb4ba1ee6b08478edc4667edafd"} Apr 16 14:11:49.431834 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:49.431814 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:11:49.451910 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:11:49.451867 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" podStartSLOduration=0.764026502 podStartE2EDuration="4.451852556s" podCreationTimestamp="2026-04-16 14:11:45 +0000 UTC" firstStartedPulling="2026-04-16 14:11:45.479551125 +0000 UTC m=+745.673367037" lastFinishedPulling="2026-04-16 14:11:49.167377178 +0000 UTC m=+749.361193091" observedRunningTime="2026-04-16 14:11:49.450141451 +0000 UTC m=+749.643957386" watchObservedRunningTime="2026-04-16 14:11:49.451852556 +0000 UTC m=+749.645668492" Apr 16 14:12:00.434989 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:00.434962 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:12:01.770797 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:01.770764 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9zhpt"] Apr 16 14:12:01.771215 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:01.770958 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" podUID="026c0c78-5d4a-404b-bb51-29302f80aa98" containerName="limitador" containerID="cri-o://581d6528a8275eeccc9e50e7f63b84303ba20fb4ba1ee6b08478edc4667edafd" gracePeriod=30 Apr 16 14:12:02.474119 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.474029 2569 generic.go:358] "Generic (PLEG): container finished" podID="026c0c78-5d4a-404b-bb51-29302f80aa98" 
containerID="581d6528a8275eeccc9e50e7f63b84303ba20fb4ba1ee6b08478edc4667edafd" exitCode=0 Apr 16 14:12:02.474119 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.474098 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" event={"ID":"026c0c78-5d4a-404b-bb51-29302f80aa98","Type":"ContainerDied","Data":"581d6528a8275eeccc9e50e7f63b84303ba20fb4ba1ee6b08478edc4667edafd"} Apr 16 14:12:02.711094 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.711074 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:12:02.825553 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.825527 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/026c0c78-5d4a-404b-bb51-29302f80aa98-config-file\") pod \"026c0c78-5d4a-404b-bb51-29302f80aa98\" (UID: \"026c0c78-5d4a-404b-bb51-29302f80aa98\") " Apr 16 14:12:02.825887 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.825608 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cxnb8\" (UniqueName: \"kubernetes.io/projected/026c0c78-5d4a-404b-bb51-29302f80aa98-kube-api-access-cxnb8\") pod \"026c0c78-5d4a-404b-bb51-29302f80aa98\" (UID: \"026c0c78-5d4a-404b-bb51-29302f80aa98\") " Apr 16 14:12:02.825887 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.825828 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/026c0c78-5d4a-404b-bb51-29302f80aa98-config-file" (OuterVolumeSpecName: "config-file") pod "026c0c78-5d4a-404b-bb51-29302f80aa98" (UID: "026c0c78-5d4a-404b-bb51-29302f80aa98"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:12:02.827833 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.827813 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/026c0c78-5d4a-404b-bb51-29302f80aa98-kube-api-access-cxnb8" (OuterVolumeSpecName: "kube-api-access-cxnb8") pod "026c0c78-5d4a-404b-bb51-29302f80aa98" (UID: "026c0c78-5d4a-404b-bb51-29302f80aa98"). InnerVolumeSpecName "kube-api-access-cxnb8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:12:02.926895 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.926857 2569 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/026c0c78-5d4a-404b-bb51-29302f80aa98-config-file\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:02.926895 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:02.926892 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cxnb8\" (UniqueName: \"kubernetes.io/projected/026c0c78-5d4a-404b-bb51-29302f80aa98-kube-api-access-cxnb8\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:03.478524 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:03.478490 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" event={"ID":"026c0c78-5d4a-404b-bb51-29302f80aa98","Type":"ContainerDied","Data":"49d89afbeb358f6b29ec09e4e2877ac7e591c643fe137a173ad8e627539c8a5d"} Apr 16 14:12:03.478524 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:03.478530 2569 scope.go:117] "RemoveContainer" containerID="581d6528a8275eeccc9e50e7f63b84303ba20fb4ba1ee6b08478edc4667edafd" Apr 16 14:12:03.478767 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:03.478531 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-9zhpt" Apr 16 14:12:03.499740 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:03.499717 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9zhpt"] Apr 16 14:12:03.502822 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:03.502803 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-9zhpt"] Apr 16 14:12:04.346863 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:04.346831 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="026c0c78-5d4a-404b-bb51-29302f80aa98" path="/var/lib/kubelet/pods/026c0c78-5d4a-404b-bb51-29302f80aa98/volumes" Apr 16 14:12:20.749197 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.749111 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64"] Apr 16 14:12:20.749693 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.749551 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="026c0c78-5d4a-404b-bb51-29302f80aa98" containerName="limitador" Apr 16 14:12:20.749693 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.749568 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="026c0c78-5d4a-404b-bb51-29302f80aa98" containerName="limitador" Apr 16 14:12:20.749693 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.749632 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="026c0c78-5d4a-404b-bb51-29302f80aa98" containerName="limitador" Apr 16 14:12:20.752138 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.752118 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.762452 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.762430 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64"] Apr 16 14:12:20.860536 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.860500 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpjh6\" (UniqueName: \"kubernetes.io/projected/a32a6135-bf34-4999-bce9-2bf65c6ec74a-kube-api-access-cpjh6\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.860536 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.860542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a32a6135-bf34-4999-bce9-2bf65c6ec74a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.860756 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.860558 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.860756 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.860658 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.860756 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.860695 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.860858 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.860785 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.860858 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.860815 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.961918 
ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.961864 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.962109 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.961937 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.962109 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.961999 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cpjh6\" (UniqueName: \"kubernetes.io/projected/a32a6135-bf34-4999-bce9-2bf65c6ec74a-kube-api-access-cpjh6\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.962109 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.962035 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a32a6135-bf34-4999-bce9-2bf65c6ec74a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.962109 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.962058 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.962357 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.962124 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.962357 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.962166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.962959 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.962934 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-csr-ca-configmap\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.964969 
ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.964937 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/a32a6135-bf34-4999-bce9-2bf65c6ec74a-local-certs\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.965137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.965109 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-cacerts\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.965137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.965130 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-csr-dns-cert\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.965305 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.965181 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-kubeconfig\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.975303 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.975251 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/a32a6135-bf34-4999-bce9-2bf65c6ec74a-istio-token\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:20.975816 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:20.975770 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpjh6\" (UniqueName: \"kubernetes.io/projected/a32a6135-bf34-4999-bce9-2bf65c6ec74a-kube-api-access-cpjh6\") pod \"istiod-openshift-gateway-55ff986f96-tpb64\" (UID: \"a32a6135-bf34-4999-bce9-2bf65c6ec74a\") " pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:21.061878 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:21.061435 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:21.195212 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:21.195186 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64"] Apr 16 14:12:21.198235 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:12:21.198202 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda32a6135_bf34_4999_bce9_2bf65c6ec74a.slice/crio-d575b24e18312e5a58cb3c6b08310964bd8021e152e5463595342a0cb3706041 WatchSource:0}: Error finding container d575b24e18312e5a58cb3c6b08310964bd8021e152e5463595342a0cb3706041: Status 404 returned error can't find the container with id d575b24e18312e5a58cb3c6b08310964bd8021e152e5463595342a0cb3706041 Apr 16 14:12:21.200528 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:21.200490 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 14:12:21.200637 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:21.200555 2569 kubelet_resources.go:45] "Allocatable" allocatable={"cpu":"7500m","ephemeral-storage":"114345831029","hugepages-1Gi":"0","hugepages-2Mi":"0","memory":"31236224Ki","pods":"250"} Apr 16 14:12:21.534689 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:21.534654 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" event={"ID":"a32a6135-bf34-4999-bce9-2bf65c6ec74a","Type":"ContainerStarted","Data":"62a92689608c1673c422e1b9e9c836cf9e15101b55b28139bc6dbf1860c6adc9"} Apr 16 14:12:21.534689 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:21.534693 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" event={"ID":"a32a6135-bf34-4999-bce9-2bf65c6ec74a","Type":"ContainerStarted","Data":"d575b24e18312e5a58cb3c6b08310964bd8021e152e5463595342a0cb3706041"} Apr 16 14:12:21.534939 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:21.534801 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:21.559449 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:21.559409 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" podStartSLOduration=1.559395759 podStartE2EDuration="1.559395759s" podCreationTimestamp="2026-04-16 14:12:20 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:12:21.559021215 +0000 UTC m=+781.752837152" watchObservedRunningTime="2026-04-16 14:12:21.559395759 +0000 UTC m=+781.753211693" Apr 16 14:12:22.539617 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.539581 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/istiod-openshift-gateway-55ff986f96-tpb64" Apr 16 14:12:22.602607 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.602579 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26"] Apr 16 14:12:22.602823 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.602801 2569 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" podUID="487feac5-bfbb-4728-957d-37c930ffbabb" containerName="discovery" containerID="cri-o://edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d" gracePeriod=30 Apr 16 14:12:22.849569 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.849546 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:12:22.978427 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.978382 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-ca-configmap\") pod \"487feac5-bfbb-4728-957d-37c930ffbabb\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " Apr 16 14:12:22.978427 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.978424 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-istio-token\") pod \"487feac5-bfbb-4728-957d-37c930ffbabb\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " Apr 16 14:12:22.978666 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.978468 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-cacerts\") pod \"487feac5-bfbb-4728-957d-37c930ffbabb\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " Apr 16 14:12:22.978666 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.978495 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-kubeconfig\") pod \"487feac5-bfbb-4728-957d-37c930ffbabb\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " Apr 16 14:12:22.978666 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.978510 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-dns-cert\") pod \"487feac5-bfbb-4728-957d-37c930ffbabb\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " Apr 16 14:12:22.978666 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.978543 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/487feac5-bfbb-4728-957d-37c930ffbabb-local-certs\") pod \"487feac5-bfbb-4728-957d-37c930ffbabb\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " Apr 16 14:12:22.978666 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.978568 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-td2qb\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-kube-api-access-td2qb\") pod \"487feac5-bfbb-4728-957d-37c930ffbabb\" (UID: \"487feac5-bfbb-4728-957d-37c930ffbabb\") " Apr 16 14:12:22.978906 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.978882 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-ca-configmap" (OuterVolumeSpecName: "istio-csr-ca-configmap") pod "487feac5-bfbb-4728-957d-37c930ffbabb" (UID: "487feac5-bfbb-4728-957d-37c930ffbabb"). InnerVolumeSpecName "istio-csr-ca-configmap". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 16 14:12:22.981073 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.981041 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-cacerts" (OuterVolumeSpecName: "cacerts") pod "487feac5-bfbb-4728-957d-37c930ffbabb" (UID: "487feac5-bfbb-4728-957d-37c930ffbabb"). InnerVolumeSpecName "cacerts". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:12:22.981300 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.981249 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-dns-cert" (OuterVolumeSpecName: "istio-csr-dns-cert") pod "487feac5-bfbb-4728-957d-37c930ffbabb" (UID: "487feac5-bfbb-4728-957d-37c930ffbabb"). InnerVolumeSpecName "istio-csr-dns-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:12:22.981413 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.981339 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-kubeconfig" (OuterVolumeSpecName: "istio-kubeconfig") pod "487feac5-bfbb-4728-957d-37c930ffbabb" (UID: "487feac5-bfbb-4728-957d-37c930ffbabb"). InnerVolumeSpecName "istio-kubeconfig". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:12:22.981413 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.981361 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-istio-token" (OuterVolumeSpecName: "istio-token") pod "487feac5-bfbb-4728-957d-37c930ffbabb" (UID: "487feac5-bfbb-4728-957d-37c930ffbabb"). InnerVolumeSpecName "istio-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:12:22.981413 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.981376 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/487feac5-bfbb-4728-957d-37c930ffbabb-local-certs" (OuterVolumeSpecName: "local-certs") pod "487feac5-bfbb-4728-957d-37c930ffbabb" (UID: "487feac5-bfbb-4728-957d-37c930ffbabb"). InnerVolumeSpecName "local-certs". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:12:22.981413 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:22.981373 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-kube-api-access-td2qb" (OuterVolumeSpecName: "kube-api-access-td2qb") pod "487feac5-bfbb-4728-957d-37c930ffbabb" (UID: "487feac5-bfbb-4728-957d-37c930ffbabb"). InnerVolumeSpecName "kube-api-access-td2qb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:12:23.079726 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.079699 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-ca-configmap\" (UniqueName: \"kubernetes.io/configmap/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-ca-configmap\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:23.079726 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.079722 2569 reconciler_common.go:299] "Volume detached for volume \"istio-token\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-istio-token\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:23.079726 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.079732 2569 reconciler_common.go:299] "Volume detached for volume \"cacerts\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-cacerts\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:23.079943 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.079741 2569 reconciler_common.go:299] "Volume detached for volume \"istio-kubeconfig\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-kubeconfig\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:23.079943 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.079750 2569 reconciler_common.go:299] "Volume detached for volume \"istio-csr-dns-cert\" (UniqueName: \"kubernetes.io/secret/487feac5-bfbb-4728-957d-37c930ffbabb-istio-csr-dns-cert\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:23.079943 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.079759 2569 reconciler_common.go:299] "Volume detached for volume \"local-certs\" (UniqueName: \"kubernetes.io/empty-dir/487feac5-bfbb-4728-957d-37c930ffbabb-local-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:23.079943 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.079769 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-td2qb\" (UniqueName: \"kubernetes.io/projected/487feac5-bfbb-4728-957d-37c930ffbabb-kube-api-access-td2qb\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:12:23.541954 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.541916 2569 generic.go:358] "Generic (PLEG): container finished" podID="487feac5-bfbb-4728-957d-37c930ffbabb" containerID="edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d" exitCode=0 Apr 16 14:12:23.542457 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.542008 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" Apr 16 14:12:23.542457 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.542006 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" event={"ID":"487feac5-bfbb-4728-957d-37c930ffbabb","Type":"ContainerDied","Data":"edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d"} Apr 16 14:12:23.542457 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.542056 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26" event={"ID":"487feac5-bfbb-4728-957d-37c930ffbabb","Type":"ContainerDied","Data":"4c84708d0b82c692abb20b0dbcd79e071e85b22c25683f1f272dabea51a30a38"} Apr 16 14:12:23.542457 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.542077 2569 scope.go:117] "RemoveContainer" containerID="edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d" Apr 16 14:12:23.551223 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.551202 2569 scope.go:117] "RemoveContainer" containerID="edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d" Apr 16 14:12:23.551520 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:12:23.551496 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d\": container with ID starting with edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d not found: ID does not exist" containerID="edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d" Apr 16 14:12:23.551623 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.551531 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d"} err="failed to get container status \"edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d\": rpc error: code = NotFound desc = could not find container \"edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d\": container with ID starting with edaf30cc8aabed6a1ae5877caaac5ca04de09e08abf0f93e5ce17e3f6c7d242d not found: ID does not exist" Apr 16 14:12:23.567436 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.567406 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26"] Apr 16 14:12:23.572589 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:23.572564 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-ingress/istiod-openshift-gateway-7cd77c7ffd-4cx26"] Apr 16 14:12:24.347677 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:24.347644 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="487feac5-bfbb-4728-957d-37c930ffbabb" path="/var/lib/kubelet/pods/487feac5-bfbb-4728-957d-37c930ffbabb/volumes" Apr 16 14:12:29.487639 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.487609 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-wz5cf"] Apr 16 14:12:29.487989 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.487872 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="487feac5-bfbb-4728-957d-37c930ffbabb" containerName="discovery" Apr 16 14:12:29.487989 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.487882 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="487feac5-bfbb-4728-957d-37c930ffbabb" containerName="discovery" Apr 16 14:12:29.487989 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.487927 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="487feac5-bfbb-4728-957d-37c930ffbabb" containerName="discovery" Apr 16 14:12:29.495605 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.495578 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:29.498401 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.498376 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 16 14:12:29.499472 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.499429 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-xhk4z\"" Apr 16 14:12:29.499472 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.499445 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 16 14:12:29.499657 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.499450 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 16 14:12:29.499866 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.499850 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-wz5cf"] Apr 16 14:12:29.503056 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.502979 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85"] Apr 16 14:12:29.506276 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.506246 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:29.508807 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.508790 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 16 14:12:29.509076 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.509060 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-nb7bt\"" Apr 16 14:12:29.518653 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.518633 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85"] Apr 16 14:12:29.627541 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.627510 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-flb7h\" (UniqueName: \"kubernetes.io/projected/795db089-640f-44aa-a39d-7a6761e2d7d1-kube-api-access-flb7h\") pod \"kserve-controller-manager-75d667c7c4-wz5cf\" (UID: \"795db089-640f-44aa-a39d-7a6761e2d7d1\") " pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:29.627541 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.627542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert\") pod \"llmisvc-controller-manager-7ccc8fbdb4-t7z85\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:29.627722 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.627629 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/795db089-640f-44aa-a39d-7a6761e2d7d1-cert\") pod \"kserve-controller-manager-75d667c7c4-wz5cf\" (UID: \"795db089-640f-44aa-a39d-7a6761e2d7d1\") " pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:29.627722 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.627676 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4k94x\" (UniqueName: \"kubernetes.io/projected/0aa370be-e90a-4ef6-b024-85ba8248f0b6-kube-api-access-4k94x\") pod \"llmisvc-controller-manager-7ccc8fbdb4-t7z85\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:29.728222 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.728178 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/795db089-640f-44aa-a39d-7a6761e2d7d1-cert\") pod \"kserve-controller-manager-75d667c7c4-wz5cf\" (UID: \"795db089-640f-44aa-a39d-7a6761e2d7d1\") " pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:29.728426 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.728240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4k94x\" (UniqueName: \"kubernetes.io/projected/0aa370be-e90a-4ef6-b024-85ba8248f0b6-kube-api-access-4k94x\") pod \"llmisvc-controller-manager-7ccc8fbdb4-t7z85\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:29.728426 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.728298 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-flb7h\" (UniqueName: 
\"kubernetes.io/projected/795db089-640f-44aa-a39d-7a6761e2d7d1-kube-api-access-flb7h\") pod \"kserve-controller-manager-75d667c7c4-wz5cf\" (UID: \"795db089-640f-44aa-a39d-7a6761e2d7d1\") " pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:29.728426 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.728323 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert\") pod \"llmisvc-controller-manager-7ccc8fbdb4-t7z85\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:29.728426 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:12:29.728422 2569 secret.go:189] Couldn't get secret kserve/llmisvc-webhook-server-cert: secret "llmisvc-webhook-server-cert" not found Apr 16 14:12:29.728626 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:12:29.728558 2569 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert podName:0aa370be-e90a-4ef6-b024-85ba8248f0b6 nodeName:}" failed. No retries permitted until 2026-04-16 14:12:30.228468743 +0000 UTC m=+790.422284659 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert") pod "llmisvc-controller-manager-7ccc8fbdb4-t7z85" (UID: "0aa370be-e90a-4ef6-b024-85ba8248f0b6") : secret "llmisvc-webhook-server-cert" not found Apr 16 14:12:29.730762 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.730740 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/795db089-640f-44aa-a39d-7a6761e2d7d1-cert\") pod \"kserve-controller-manager-75d667c7c4-wz5cf\" (UID: \"795db089-640f-44aa-a39d-7a6761e2d7d1\") " pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:29.737638 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.737612 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4k94x\" (UniqueName: \"kubernetes.io/projected/0aa370be-e90a-4ef6-b024-85ba8248f0b6-kube-api-access-4k94x\") pod \"llmisvc-controller-manager-7ccc8fbdb4-t7z85\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:29.737769 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.737654 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-flb7h\" (UniqueName: \"kubernetes.io/projected/795db089-640f-44aa-a39d-7a6761e2d7d1-kube-api-access-flb7h\") pod \"kserve-controller-manager-75d667c7c4-wz5cf\" (UID: \"795db089-640f-44aa-a39d-7a6761e2d7d1\") " pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:29.806317 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.806290 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:29.927995 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:29.927959 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-wz5cf"] Apr 16 14:12:29.930947 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:12:29.930926 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod795db089_640f_44aa_a39d_7a6761e2d7d1.slice/crio-61ab4d1bb65496c2c44818f164da8ced9689f5fac1ff2b1c1b9f01ebcda78a4b WatchSource:0}: Error finding container 61ab4d1bb65496c2c44818f164da8ced9689f5fac1ff2b1c1b9f01ebcda78a4b: Status 404 returned error can't find the container with id 61ab4d1bb65496c2c44818f164da8ced9689f5fac1ff2b1c1b9f01ebcda78a4b Apr 16 14:12:30.232569 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:30.232532 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert\") pod \"llmisvc-controller-manager-7ccc8fbdb4-t7z85\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:30.234877 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:30.234849 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert\") pod \"llmisvc-controller-manager-7ccc8fbdb4-t7z85\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:30.415424 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:30.415391 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:30.552169 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:30.552135 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85"] Apr 16 14:12:30.555780 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:12:30.555747 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0aa370be_e90a_4ef6_b024_85ba8248f0b6.slice/crio-80ed9b23e3daf01f74470ea401c48297becd8d85874c2f96ffb037eb5da24ddc WatchSource:0}: Error finding container 80ed9b23e3daf01f74470ea401c48297becd8d85874c2f96ffb037eb5da24ddc: Status 404 returned error can't find the container with id 80ed9b23e3daf01f74470ea401c48297becd8d85874c2f96ffb037eb5da24ddc Apr 16 14:12:30.565562 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:30.565536 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" event={"ID":"0aa370be-e90a-4ef6-b024-85ba8248f0b6","Type":"ContainerStarted","Data":"80ed9b23e3daf01f74470ea401c48297becd8d85874c2f96ffb037eb5da24ddc"} Apr 16 14:12:30.566603 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:30.566576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" event={"ID":"795db089-640f-44aa-a39d-7a6761e2d7d1","Type":"ContainerStarted","Data":"61ab4d1bb65496c2c44818f164da8ced9689f5fac1ff2b1c1b9f01ebcda78a4b"} Apr 16 14:12:33.578239 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:33.578201 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" event={"ID":"795db089-640f-44aa-a39d-7a6761e2d7d1","Type":"ContainerStarted","Data":"9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41"} Apr 16 14:12:33.578682 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:33.578432 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:12:33.595028 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:33.594982 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" podStartSLOduration=1.312494632 podStartE2EDuration="4.594969713s" podCreationTimestamp="2026-04-16 14:12:29 +0000 UTC" firstStartedPulling="2026-04-16 14:12:29.932110276 +0000 UTC m=+790.125926189" lastFinishedPulling="2026-04-16 14:12:33.21458535 +0000 UTC m=+793.408401270" observedRunningTime="2026-04-16 14:12:33.594124442 +0000 UTC m=+793.787940387" watchObservedRunningTime="2026-04-16 14:12:33.594969713 +0000 UTC m=+793.788785647" Apr 16 14:12:35.585841 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:35.585805 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" event={"ID":"0aa370be-e90a-4ef6-b024-85ba8248f0b6","Type":"ContainerStarted","Data":"a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9"} Apr 16 14:12:35.586315 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:35.585917 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:12:35.602330 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:12:35.602292 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" podStartSLOduration=2.090313436 podStartE2EDuration="6.602279943s" 
podCreationTimestamp="2026-04-16 14:12:29 +0000 UTC" firstStartedPulling="2026-04-16 14:12:30.557318388 +0000 UTC m=+790.751134304" lastFinishedPulling="2026-04-16 14:12:35.069284894 +0000 UTC m=+795.263100811" observedRunningTime="2026-04-16 14:12:35.600536615 +0000 UTC m=+795.794352549" watchObservedRunningTime="2026-04-16 14:12:35.602279943 +0000 UTC m=+795.796095869" Apr 16 14:13:04.586566 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:04.586534 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:13:06.591943 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:06.591914 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:13:07.812494 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:07.812460 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-wz5cf"] Apr 16 14:13:07.812868 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:07.812688 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" podUID="795db089-640f-44aa-a39d-7a6761e2d7d1" containerName="manager" containerID="cri-o://9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41" gracePeriod=10 Apr 16 14:13:07.832231 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:07.832209 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-hrxgl"] Apr 16 14:13:07.835483 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:07.835465 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:07.844785 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:07.844766 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-hrxgl"] Apr 16 14:13:07.911621 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:07.911595 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e804e63-6427-4ecd-ad17-f5b39340b162-cert\") pod \"kserve-controller-manager-75d667c7c4-hrxgl\" (UID: \"9e804e63-6427-4ecd-ad17-f5b39340b162\") " pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:07.911728 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:07.911656 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-45fkr\" (UniqueName: \"kubernetes.io/projected/9e804e63-6427-4ecd-ad17-f5b39340b162-kube-api-access-45fkr\") pod \"kserve-controller-manager-75d667c7c4-hrxgl\" (UID: \"9e804e63-6427-4ecd-ad17-f5b39340b162\") " pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:08.012914 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.012880 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e804e63-6427-4ecd-ad17-f5b39340b162-cert\") pod \"kserve-controller-manager-75d667c7c4-hrxgl\" (UID: \"9e804e63-6427-4ecd-ad17-f5b39340b162\") " pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:08.013081 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.012950 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-45fkr\" (UniqueName: 
\"kubernetes.io/projected/9e804e63-6427-4ecd-ad17-f5b39340b162-kube-api-access-45fkr\") pod \"kserve-controller-manager-75d667c7c4-hrxgl\" (UID: \"9e804e63-6427-4ecd-ad17-f5b39340b162\") " pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:08.015225 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.015199 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/9e804e63-6427-4ecd-ad17-f5b39340b162-cert\") pod \"kserve-controller-manager-75d667c7c4-hrxgl\" (UID: \"9e804e63-6427-4ecd-ad17-f5b39340b162\") " pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:08.021822 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.021786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-45fkr\" (UniqueName: \"kubernetes.io/projected/9e804e63-6427-4ecd-ad17-f5b39340b162-kube-api-access-45fkr\") pod \"kserve-controller-manager-75d667c7c4-hrxgl\" (UID: \"9e804e63-6427-4ecd-ad17-f5b39340b162\") " pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:08.052108 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.052083 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:13:08.113565 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.113487 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/795db089-640f-44aa-a39d-7a6761e2d7d1-cert\") pod \"795db089-640f-44aa-a39d-7a6761e2d7d1\" (UID: \"795db089-640f-44aa-a39d-7a6761e2d7d1\") " Apr 16 14:13:08.113565 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.113524 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-flb7h\" (UniqueName: \"kubernetes.io/projected/795db089-640f-44aa-a39d-7a6761e2d7d1-kube-api-access-flb7h\") pod \"795db089-640f-44aa-a39d-7a6761e2d7d1\" (UID: \"795db089-640f-44aa-a39d-7a6761e2d7d1\") " Apr 16 14:13:08.115708 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.115671 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/795db089-640f-44aa-a39d-7a6761e2d7d1-kube-api-access-flb7h" (OuterVolumeSpecName: "kube-api-access-flb7h") pod "795db089-640f-44aa-a39d-7a6761e2d7d1" (UID: "795db089-640f-44aa-a39d-7a6761e2d7d1"). InnerVolumeSpecName "kube-api-access-flb7h". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:13:08.115811 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.115701 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/795db089-640f-44aa-a39d-7a6761e2d7d1-cert" (OuterVolumeSpecName: "cert") pod "795db089-640f-44aa-a39d-7a6761e2d7d1" (UID: "795db089-640f-44aa-a39d-7a6761e2d7d1"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:13:08.194325 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.194290 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:08.214500 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.214474 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/795db089-640f-44aa-a39d-7a6761e2d7d1-cert\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:13:08.214608 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.214504 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-flb7h\" (UniqueName: \"kubernetes.io/projected/795db089-640f-44aa-a39d-7a6761e2d7d1-kube-api-access-flb7h\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:13:08.311174 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.311145 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-hrxgl"] Apr 16 14:13:08.314281 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:13:08.314244 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9e804e63_6427_4ecd_ad17_f5b39340b162.slice/crio-6b9e96aa204b0b529d9ad2023d3363f83c3c74c4bb1ff9612624d16dd42bf3e8 WatchSource:0}: Error finding container 6b9e96aa204b0b529d9ad2023d3363f83c3c74c4bb1ff9612624d16dd42bf3e8: Status 404 returned error can't find the container with id 6b9e96aa204b0b529d9ad2023d3363f83c3c74c4bb1ff9612624d16dd42bf3e8 Apr 16 14:13:08.315454 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.315436 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:13:08.683765 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.683728 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" event={"ID":"9e804e63-6427-4ecd-ad17-f5b39340b162","Type":"ContainerStarted","Data":"6b9e96aa204b0b529d9ad2023d3363f83c3c74c4bb1ff9612624d16dd42bf3e8"} Apr 16 14:13:08.684833 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.684801 2569 generic.go:358] "Generic (PLEG): container finished" podID="795db089-640f-44aa-a39d-7a6761e2d7d1" containerID="9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41" exitCode=0 Apr 16 14:13:08.684970 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.684869 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" Apr 16 14:13:08.684970 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.684871 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" event={"ID":"795db089-640f-44aa-a39d-7a6761e2d7d1","Type":"ContainerDied","Data":"9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41"} Apr 16 14:13:08.685081 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.684972 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-wz5cf" event={"ID":"795db089-640f-44aa-a39d-7a6761e2d7d1","Type":"ContainerDied","Data":"61ab4d1bb65496c2c44818f164da8ced9689f5fac1ff2b1c1b9f01ebcda78a4b"} Apr 16 14:13:08.685081 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.684988 2569 scope.go:117] "RemoveContainer" containerID="9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41" Apr 16 14:13:08.692695 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.692660 2569 scope.go:117] "RemoveContainer" containerID="9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41" Apr 16 14:13:08.692953 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:13:08.692933 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41\": container with ID starting with 9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41 not found: ID does not exist" containerID="9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41" Apr 16 14:13:08.693022 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.692960 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41"} err="failed to get container status \"9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41\": rpc error: code = NotFound desc = could not find container \"9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41\": container with ID starting with 9c9a55450ad40eaec28be0549a86541d6941522e65b52d0132dafba61178ea41 not found: ID does not exist" Apr 16 14:13:08.702486 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.702464 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-wz5cf"] Apr 16 14:13:08.707726 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:08.707705 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-75d667c7c4-wz5cf"] Apr 16 14:13:09.689349 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:09.689316 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" event={"ID":"9e804e63-6427-4ecd-ad17-f5b39340b162","Type":"ContainerStarted","Data":"22fad6f700ece5d6823b8148ff66f505ff81c85aaf8c7d46d9da64d3b3a17b5d"} Apr 16 14:13:09.689789 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:09.689389 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:09.709198 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:09.709123 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" podStartSLOduration=2.175223458 podStartE2EDuration="2.709094995s" podCreationTimestamp="2026-04-16 14:13:07 +0000 UTC" 
firstStartedPulling="2026-04-16 14:13:08.315560854 +0000 UTC m=+828.509376766" lastFinishedPulling="2026-04-16 14:13:08.849432373 +0000 UTC m=+829.043248303" observedRunningTime="2026-04-16 14:13:09.707863596 +0000 UTC m=+829.901679531" watchObservedRunningTime="2026-04-16 14:13:09.709094995 +0000 UTC m=+829.902910932" Apr 16 14:13:10.348695 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:10.348658 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="795db089-640f-44aa-a39d-7a6761e2d7d1" path="/var/lib/kubelet/pods/795db089-640f-44aa-a39d-7a6761e2d7d1/volumes" Apr 16 14:13:40.698148 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:40.698118 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-75d667c7c4-hrxgl" Apr 16 14:13:41.683317 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.683283 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-dw5c9"] Apr 16 14:13:41.683578 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.683567 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="795db089-640f-44aa-a39d-7a6761e2d7d1" containerName="manager" Apr 16 14:13:41.683578 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.683579 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="795db089-640f-44aa-a39d-7a6761e2d7d1" containerName="manager" Apr 16 14:13:41.683660 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.683623 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="795db089-640f-44aa-a39d-7a6761e2d7d1" containerName="manager" Apr 16 14:13:41.688026 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.688000 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:41.690825 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.690807 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-t2brt\"" Apr 16 14:13:41.691010 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.690989 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 16 14:13:41.698216 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.698196 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-z4b9p"] Apr 16 14:13:41.701257 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.701241 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dw5c9"] Apr 16 14:13:41.701374 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.701366 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:41.703971 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.703950 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 16 14:13:41.704069 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.704023 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-xnpv2\"" Apr 16 14:13:41.712731 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.712706 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-z4b9p"] Apr 16 14:13:41.777375 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.777347 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pc55c\" (UniqueName: \"kubernetes.io/projected/c3596ae7-10db-4eae-81bb-18f75dccd1fc-kube-api-access-pc55c\") pod \"model-serving-api-86f7b4b499-dw5c9\" (UID: \"c3596ae7-10db-4eae-81bb-18f75dccd1fc\") " pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:41.777502 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.777401 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3596ae7-10db-4eae-81bb-18f75dccd1fc-tls-certs\") pod \"model-serving-api-86f7b4b499-dw5c9\" (UID: \"c3596ae7-10db-4eae-81bb-18f75dccd1fc\") " pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:41.878153 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.878120 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46cf7836-eb7e-4810-a85d-22f2e3b83333-cert\") pod \"odh-model-controller-696fc77849-z4b9p\" (UID: \"46cf7836-eb7e-4810-a85d-22f2e3b83333\") " pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:41.878350 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.878167 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pc55c\" (UniqueName: \"kubernetes.io/projected/c3596ae7-10db-4eae-81bb-18f75dccd1fc-kube-api-access-pc55c\") pod \"model-serving-api-86f7b4b499-dw5c9\" (UID: \"c3596ae7-10db-4eae-81bb-18f75dccd1fc\") " pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:41.878350 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.878220 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r8r2n\" (UniqueName: \"kubernetes.io/projected/46cf7836-eb7e-4810-a85d-22f2e3b83333-kube-api-access-r8r2n\") pod \"odh-model-controller-696fc77849-z4b9p\" (UID: \"46cf7836-eb7e-4810-a85d-22f2e3b83333\") " pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:41.878350 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.878240 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3596ae7-10db-4eae-81bb-18f75dccd1fc-tls-certs\") pod \"model-serving-api-86f7b4b499-dw5c9\" (UID: \"c3596ae7-10db-4eae-81bb-18f75dccd1fc\") " pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:41.880800 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.880775 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c3596ae7-10db-4eae-81bb-18f75dccd1fc-tls-certs\") pod 
\"model-serving-api-86f7b4b499-dw5c9\" (UID: \"c3596ae7-10db-4eae-81bb-18f75dccd1fc\") " pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:41.887856 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.887834 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pc55c\" (UniqueName: \"kubernetes.io/projected/c3596ae7-10db-4eae-81bb-18f75dccd1fc-kube-api-access-pc55c\") pod \"model-serving-api-86f7b4b499-dw5c9\" (UID: \"c3596ae7-10db-4eae-81bb-18f75dccd1fc\") " pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:41.978677 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.978616 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r8r2n\" (UniqueName: \"kubernetes.io/projected/46cf7836-eb7e-4810-a85d-22f2e3b83333-kube-api-access-r8r2n\") pod \"odh-model-controller-696fc77849-z4b9p\" (UID: \"46cf7836-eb7e-4810-a85d-22f2e3b83333\") " pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:41.978677 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.978660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46cf7836-eb7e-4810-a85d-22f2e3b83333-cert\") pod \"odh-model-controller-696fc77849-z4b9p\" (UID: \"46cf7836-eb7e-4810-a85d-22f2e3b83333\") " pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:41.980885 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.980862 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/46cf7836-eb7e-4810-a85d-22f2e3b83333-cert\") pod \"odh-model-controller-696fc77849-z4b9p\" (UID: \"46cf7836-eb7e-4810-a85d-22f2e3b83333\") " pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:41.988760 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.988740 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r8r2n\" (UniqueName: \"kubernetes.io/projected/46cf7836-eb7e-4810-a85d-22f2e3b83333-kube-api-access-r8r2n\") pod \"odh-model-controller-696fc77849-z4b9p\" (UID: \"46cf7836-eb7e-4810-a85d-22f2e3b83333\") " pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:41.998741 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:41.998724 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:42.011521 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:42.011503 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:42.132496 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:42.132467 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-dw5c9"] Apr 16 14:13:42.155174 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:42.155145 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-z4b9p"] Apr 16 14:13:42.158408 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:13:42.158382 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46cf7836_eb7e_4810_a85d_22f2e3b83333.slice/crio-3bcc391703ef3bb4bfa65d2d47cba2fec25f3f5d72012f57c696f61ca5c6fa40 WatchSource:0}: Error finding container 3bcc391703ef3bb4bfa65d2d47cba2fec25f3f5d72012f57c696f61ca5c6fa40: Status 404 returned error can't find the container with id 3bcc391703ef3bb4bfa65d2d47cba2fec25f3f5d72012f57c696f61ca5c6fa40 Apr 16 14:13:42.796111 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:42.796068 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-z4b9p" event={"ID":"46cf7836-eb7e-4810-a85d-22f2e3b83333","Type":"ContainerStarted","Data":"3bcc391703ef3bb4bfa65d2d47cba2fec25f3f5d72012f57c696f61ca5c6fa40"} Apr 16 14:13:42.797519 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:42.797489 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dw5c9" event={"ID":"c3596ae7-10db-4eae-81bb-18f75dccd1fc","Type":"ContainerStarted","Data":"4aca12719468637b7d93d491e055870b6e03d4e279bec5126319fc5311ce6bdf"} Apr 16 14:13:45.807916 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:45.807876 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-z4b9p" event={"ID":"46cf7836-eb7e-4810-a85d-22f2e3b83333","Type":"ContainerStarted","Data":"571ee13f2c76b786d6c0fc15a45ab248547d9fd42a83c579fdf4edbf67f4d3b9"} Apr 16 14:13:45.808410 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:45.808143 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:45.809146 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:45.809122 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-dw5c9" event={"ID":"c3596ae7-10db-4eae-81bb-18f75dccd1fc","Type":"ContainerStarted","Data":"f3cadfa4319a5d29ea863d5f9ceb0af13a60d6af338bee0105c3d4b052be26c1"} Apr 16 14:13:45.809258 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:45.809244 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:13:45.844008 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:45.843960 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-z4b9p" podStartSLOduration=1.906963585 podStartE2EDuration="4.843946821s" podCreationTimestamp="2026-04-16 14:13:41 +0000 UTC" firstStartedPulling="2026-04-16 14:13:42.159582736 +0000 UTC m=+862.353398649" lastFinishedPulling="2026-04-16 14:13:45.09656596 +0000 UTC m=+865.290381885" observedRunningTime="2026-04-16 14:13:45.843099141 +0000 UTC m=+866.036915079" watchObservedRunningTime="2026-04-16 14:13:45.843946821 +0000 UTC m=+866.037762753" Apr 16 14:13:45.867505 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:45.867446 2569 pod_startup_latency_tracker.go:104] 
"Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-dw5c9" podStartSLOduration=1.913753234 podStartE2EDuration="4.867429509s" podCreationTimestamp="2026-04-16 14:13:41 +0000 UTC" firstStartedPulling="2026-04-16 14:13:42.140307329 +0000 UTC m=+862.334123242" lastFinishedPulling="2026-04-16 14:13:45.093983603 +0000 UTC m=+865.287799517" observedRunningTime="2026-04-16 14:13:45.866460776 +0000 UTC m=+866.060276712" watchObservedRunningTime="2026-04-16 14:13:45.867429509 +0000 UTC m=+866.061245445" Apr 16 14:13:56.815211 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:56.815180 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-z4b9p" Apr 16 14:13:56.816952 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:13:56.816934 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-dw5c9" Apr 16 14:14:48.729902 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.729868 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv"] Apr 16 14:14:48.733313 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.733292 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.736503 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.736479 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:14:48.736625 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.736479 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:14:48.736625 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.736483 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 16 14:14:48.737472 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.737454 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:14:48.737724 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.737521 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-rvkxr\"" Apr 16 14:14:48.745485 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.745464 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv"] Apr 16 14:14:48.885340 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.885302 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.885340 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.885338 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-uds\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.885572 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.885356 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4spk7\" (UniqueName: \"kubernetes.io/projected/92d98b1b-4960-47c7-b64d-91a546dba869-kube-api-access-4spk7\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.885572 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.885429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.885572 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.885482 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92d98b1b-4960-47c7-b64d-91a546dba869-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.885572 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.885529 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.986703 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.986618 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.986703 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.986660 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4spk7\" (UniqueName: \"kubernetes.io/projected/92d98b1b-4960-47c7-b64d-91a546dba869-kube-api-access-4spk7\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.986703 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.986698 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-kserve-provision-location\") pod 
\"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.986965 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.986730 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92d98b1b-4960-47c7-b64d-91a546dba869-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.986965 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.986770 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.986965 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.986837 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.987115 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.987015 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.987115 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.987104 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.987221 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.987153 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.987305 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.987232 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.989408 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.989389 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92d98b1b-4960-47c7-b64d-91a546dba869-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:48.995583 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:48.995559 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4spk7\" (UniqueName: \"kubernetes.io/projected/92d98b1b-4960-47c7-b64d-91a546dba869-kube-api-access-4spk7\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:49.045490 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:49.045460 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:14:49.167297 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:49.167248 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv"] Apr 16 14:14:49.172112 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:14:49.172081 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod92d98b1b_4960_47c7_b64d_91a546dba869.slice/crio-ec982eb30618184866d3e64a4952a2f696d5c1da4726c4fa463d1673742a0c67 WatchSource:0}: Error finding container ec982eb30618184866d3e64a4952a2f696d5c1da4726c4fa463d1673742a0c67: Status 404 returned error can't find the container with id ec982eb30618184866d3e64a4952a2f696d5c1da4726c4fa463d1673742a0c67 Apr 16 14:14:50.013821 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:50.013785 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" event={"ID":"92d98b1b-4960-47c7-b64d-91a546dba869","Type":"ContainerStarted","Data":"ec982eb30618184866d3e64a4952a2f696d5c1da4726c4fa463d1673742a0c67"} Apr 16 14:14:54.029531 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:54.029489 2569 generic.go:358] "Generic (PLEG): container finished" podID="92d98b1b-4960-47c7-b64d-91a546dba869" containerID="e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c" exitCode=0 Apr 16 14:14:54.029905 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:54.029576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" event={"ID":"92d98b1b-4960-47c7-b64d-91a546dba869","Type":"ContainerDied","Data":"e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c"} Apr 16 14:14:56.044316 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:14:56.044256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" event={"ID":"92d98b1b-4960-47c7-b64d-91a546dba869","Type":"ContainerStarted","Data":"f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632"} Apr 16 14:15:26.150501 ip-10-0-138-227 kubenswrapper[2569]: I0416 
14:15:26.150464 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" event={"ID":"92d98b1b-4960-47c7-b64d-91a546dba869","Type":"ContainerStarted","Data":"2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8"} Apr 16 14:15:26.150981 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:26.150692 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:15:26.153054 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:26.153030 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:15:26.172377 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:26.172334 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" podStartSLOduration=2.109414753 podStartE2EDuration="38.172321678s" podCreationTimestamp="2026-04-16 14:14:48 +0000 UTC" firstStartedPulling="2026-04-16 14:14:49.174131034 +0000 UTC m=+929.367946948" lastFinishedPulling="2026-04-16 14:15:25.237037951 +0000 UTC m=+965.430853873" observedRunningTime="2026-04-16 14:15:26.17062046 +0000 UTC m=+966.364436397" watchObservedRunningTime="2026-04-16 14:15:26.172321678 +0000 UTC m=+966.366137613" Apr 16 14:15:29.046551 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:29.046513 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:15:29.046960 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:29.046568 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:15:39.048010 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:39.047930 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:15:39.049224 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:39.049203 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:15:43.498127 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:43.498093 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv"] Apr 16 14:15:43.498739 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:43.498701 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="main" containerID="cri-o://f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632" gracePeriod=30 Apr 16 14:15:43.498877 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:43.498810 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="tokenizer" containerID="cri-o://2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8" gracePeriod=30 Apr 16 
14:15:44.218895 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.218863 2569 generic.go:358] "Generic (PLEG): container finished" podID="92d98b1b-4960-47c7-b64d-91a546dba869" containerID="f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632" exitCode=0 Apr 16 14:15:44.219073 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.218940 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" event={"ID":"92d98b1b-4960-47c7-b64d-91a546dba869","Type":"ContainerDied","Data":"f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632"} Apr 16 14:15:44.735879 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.735855 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:15:44.847315 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847200 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4spk7\" (UniqueName: \"kubernetes.io/projected/92d98b1b-4960-47c7-b64d-91a546dba869-kube-api-access-4spk7\") pod \"92d98b1b-4960-47c7-b64d-91a546dba869\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " Apr 16 14:15:44.847315 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847298 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-kserve-provision-location\") pod \"92d98b1b-4960-47c7-b64d-91a546dba869\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " Apr 16 14:15:44.847544 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847331 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-uds\") pod \"92d98b1b-4960-47c7-b64d-91a546dba869\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " Apr 16 14:15:44.847544 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847408 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92d98b1b-4960-47c7-b64d-91a546dba869-tls-certs\") pod \"92d98b1b-4960-47c7-b64d-91a546dba869\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " Apr 16 14:15:44.847544 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847440 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-tmp\") pod \"92d98b1b-4960-47c7-b64d-91a546dba869\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " Apr 16 14:15:44.847544 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847466 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-cache\") pod \"92d98b1b-4960-47c7-b64d-91a546dba869\" (UID: \"92d98b1b-4960-47c7-b64d-91a546dba869\") " Apr 16 14:15:44.847718 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847666 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "92d98b1b-4960-47c7-b64d-91a546dba869" (UID: "92d98b1b-4960-47c7-b64d-91a546dba869"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:44.847788 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847764 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "92d98b1b-4960-47c7-b64d-91a546dba869" (UID: "92d98b1b-4960-47c7-b64d-91a546dba869"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:44.847830 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.847785 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "92d98b1b-4960-47c7-b64d-91a546dba869" (UID: "92d98b1b-4960-47c7-b64d-91a546dba869"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:44.848031 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.848011 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "92d98b1b-4960-47c7-b64d-91a546dba869" (UID: "92d98b1b-4960-47c7-b64d-91a546dba869"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:15:44.849534 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.849510 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/92d98b1b-4960-47c7-b64d-91a546dba869-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "92d98b1b-4960-47c7-b64d-91a546dba869" (UID: "92d98b1b-4960-47c7-b64d-91a546dba869"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:15:44.849608 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.849513 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/92d98b1b-4960-47c7-b64d-91a546dba869-kube-api-access-4spk7" (OuterVolumeSpecName: "kube-api-access-4spk7") pod "92d98b1b-4960-47c7-b64d-91a546dba869" (UID: "92d98b1b-4960-47c7-b64d-91a546dba869"). InnerVolumeSpecName "kube-api-access-4spk7". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:15:44.948219 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.948179 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:15:44.948219 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.948214 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-uds\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:15:44.948219 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.948226 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/92d98b1b-4960-47c7-b64d-91a546dba869-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:15:44.948476 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.948234 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-tmp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:15:44.948476 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.948243 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/92d98b1b-4960-47c7-b64d-91a546dba869-tokenizer-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:15:44.948476 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:44.948252 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4spk7\" (UniqueName: \"kubernetes.io/projected/92d98b1b-4960-47c7-b64d-91a546dba869-kube-api-access-4spk7\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:15:45.223594 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.223506 2569 generic.go:358] "Generic (PLEG): container finished" podID="92d98b1b-4960-47c7-b64d-91a546dba869" containerID="2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8" exitCode=0 Apr 16 14:15:45.223594 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.223589 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" Apr 16 14:15:45.223790 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.223590 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" event={"ID":"92d98b1b-4960-47c7-b64d-91a546dba869","Type":"ContainerDied","Data":"2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8"} Apr 16 14:15:45.223790 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.223631 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv" event={"ID":"92d98b1b-4960-47c7-b64d-91a546dba869","Type":"ContainerDied","Data":"ec982eb30618184866d3e64a4952a2f696d5c1da4726c4fa463d1673742a0c67"} Apr 16 14:15:45.223790 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.223647 2569 scope.go:117] "RemoveContainer" containerID="2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8" Apr 16 14:15:45.231469 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.231449 2569 scope.go:117] "RemoveContainer" containerID="f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632" Apr 16 14:15:45.238477 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.238461 2569 scope.go:117] "RemoveContainer" containerID="e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c" Apr 16 14:15:45.245716 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.245667 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv"] Apr 16 14:15:45.246563 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.246470 2569 scope.go:117] "RemoveContainer" containerID="2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8" Apr 16 14:15:45.246840 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:15:45.246821 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8\": container with ID starting with 2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8 not found: ID does not exist" containerID="2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8" Apr 16 14:15:45.246900 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.246850 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8"} err="failed to get container status \"2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8\": rpc error: code = NotFound desc = could not find container \"2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8\": container with ID starting with 2ef78afa6362b6c7a13c0cc1e5d75e6dd74ddd02ce2a6a493ca41517eca999f8 not found: ID does not exist" Apr 16 14:15:45.246900 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.246869 2569 scope.go:117] "RemoveContainer" containerID="f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632" Apr 16 14:15:45.247145 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:15:45.247117 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632\": container with ID starting with f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632 
not found: ID does not exist" containerID="f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632" Apr 16 14:15:45.247188 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.247151 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632"} err="failed to get container status \"f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632\": rpc error: code = NotFound desc = could not find container \"f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632\": container with ID starting with f3e3d85fb969353aa121ceefd74f2f6073bfe530eebf55585dff5dc2ecef3632 not found: ID does not exist" Apr 16 14:15:45.247188 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.247165 2569 scope.go:117] "RemoveContainer" containerID="e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c" Apr 16 14:15:45.247892 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.247862 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-754c4jxnsv"] Apr 16 14:15:45.247953 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:15:45.247911 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c\": container with ID starting with e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c not found: ID does not exist" containerID="e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c" Apr 16 14:15:45.247953 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:45.247928 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c"} err="failed to get container status \"e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c\": rpc error: code = NotFound desc = could not find container \"e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c\": container with ID starting with e62e7d66ff8b2de850c3246db990f76ce5d389bdfc32d89d206139b77f03931c not found: ID does not exist" Apr 16 14:15:46.348444 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:46.348412 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" path="/var/lib/kubelet/pods/92d98b1b-4960-47c7-b64d-91a546dba869/volumes" Apr 16 14:15:50.632387 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632351 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88"] Apr 16 14:15:50.633034 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632687 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="main" Apr 16 14:15:50.633034 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632710 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="main" Apr 16 14:15:50.633034 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632737 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="tokenizer" Apr 16 14:15:50.633034 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632748 2569 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="tokenizer" Apr 16 14:15:50.633034 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632769 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="storage-initializer" Apr 16 14:15:50.633034 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632778 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="storage-initializer" Apr 16 14:15:50.633034 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632862 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="main" Apr 16 14:15:50.633034 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.632875 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="92d98b1b-4960-47c7-b64d-91a546dba869" containerName="tokenizer" Apr 16 14:15:50.912000 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.911924 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88"] Apr 16 14:15:50.912000 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.911955 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj"] Apr 16 14:15:50.912223 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.912107 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:50.914933 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.914909 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:15:50.915059 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.914908 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 16 14:15:50.915059 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.914916 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:15:50.915168 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.914919 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:15:50.943581 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.943550 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj"] Apr 16 14:15:50.943785 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.943763 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:50.946436 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.946409 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-4l6l9\"" Apr 16 14:15:50.999399 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999364 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-dshm\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:50.999399 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999406 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:50.999651 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999489 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:50.999651 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999546 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:50.999651 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999594 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:50.999770 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999667 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5746ab49-9a09-460b-9049-ee873195b2b2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:50.999770 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999703 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:50.999770 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999727 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-home\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:50.999920 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999807 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26b64\" (UniqueName: \"kubernetes.io/projected/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kube-api-access-26b64\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:50.999920 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999843 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:50.999920 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999872 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lttwn\" (UniqueName: \"kubernetes.io/projected/5746ab49-9a09-460b-9049-ee873195b2b2-kube-api-access-lttwn\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:50.999920 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:50.999904 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101030 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.100998 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101030 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101033 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kserve-provision-location\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101278 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101060 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101278 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101086 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5746ab49-9a09-460b-9049-ee873195b2b2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101278 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101278 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101150 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-home\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101278 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101196 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26b64\" (UniqueName: \"kubernetes.io/projected/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kube-api-access-26b64\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101278 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101220 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101278 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101238 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lttwn\" (UniqueName: \"kubernetes.io/projected/5746ab49-9a09-460b-9049-ee873195b2b2-kube-api-access-lttwn\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101278 
ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101301 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-dshm\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101316 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101558 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-model-cache\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101642 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.101668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101558 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-home\") pod 
\"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.101987 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.101853 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.102046 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.102023 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.103638 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.103617 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-dshm\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.103785 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.103766 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5746ab49-9a09-460b-9049-ee873195b2b2-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.103822 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.103786 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.109924 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.109901 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lttwn\" (UniqueName: \"kubernetes.io/projected/5746ab49-9a09-460b-9049-ee873195b2b2-kube-api-access-lttwn\") pod \"scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.110059 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.110039 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26b64\" (UniqueName: \"kubernetes.io/projected/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kube-api-access-26b64\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.225237 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.225149 2569 util.go:30] "No sandbox for pod 
can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:15:51.253503 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.253463 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:51.362539 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.362505 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88"] Apr 16 14:15:51.367972 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:15:51.367926 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5746ab49_9a09_460b_9049_ee873195b2b2.slice/crio-00cd977c9a7df976dde18778493048decebe751eb2291a38e22a3b37e4f35b11 WatchSource:0}: Error finding container 00cd977c9a7df976dde18778493048decebe751eb2291a38e22a3b37e4f35b11: Status 404 returned error can't find the container with id 00cd977c9a7df976dde18778493048decebe751eb2291a38e22a3b37e4f35b11 Apr 16 14:15:51.390996 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:51.390968 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj"] Apr 16 14:15:51.394650 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:15:51.394624 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1f730d0f_c09d_4052_8d36_1b25b5e834cc.slice/crio-669477caab63c9624570481211b3e7d53987cb9c3b0545f538ce9fe7d7404a2e WatchSource:0}: Error finding container 669477caab63c9624570481211b3e7d53987cb9c3b0545f538ce9fe7d7404a2e: Status 404 returned error can't find the container with id 669477caab63c9624570481211b3e7d53987cb9c3b0545f538ce9fe7d7404a2e Apr 16 14:15:52.249335 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:52.249298 2569 generic.go:358] "Generic (PLEG): container finished" podID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerID="7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4" exitCode=0 Apr 16 14:15:52.249755 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:52.249371 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" event={"ID":"1f730d0f-c09d-4052-8d36-1b25b5e834cc","Type":"ContainerDied","Data":"7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4"} Apr 16 14:15:52.249755 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:52.249406 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" event={"ID":"1f730d0f-c09d-4052-8d36-1b25b5e834cc","Type":"ContainerStarted","Data":"669477caab63c9624570481211b3e7d53987cb9c3b0545f538ce9fe7d7404a2e"} Apr 16 14:15:52.250940 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:52.250913 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" event={"ID":"5746ab49-9a09-460b-9049-ee873195b2b2","Type":"ContainerStarted","Data":"93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398"} Apr 16 14:15:52.250940 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:52.250949 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" 
event={"ID":"5746ab49-9a09-460b-9049-ee873195b2b2","Type":"ContainerStarted","Data":"00cd977c9a7df976dde18778493048decebe751eb2291a38e22a3b37e4f35b11"} Apr 16 14:15:53.256723 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:53.256687 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" event={"ID":"1f730d0f-c09d-4052-8d36-1b25b5e834cc","Type":"ContainerStarted","Data":"df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c"} Apr 16 14:15:53.256723 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:53.256730 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" event={"ID":"1f730d0f-c09d-4052-8d36-1b25b5e834cc","Type":"ContainerStarted","Data":"5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f"} Apr 16 14:15:53.281872 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:53.281823 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" podStartSLOduration=3.2818092549999998 podStartE2EDuration="3.281809255s" podCreationTimestamp="2026-04-16 14:15:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:15:53.280413417 +0000 UTC m=+993.474229379" watchObservedRunningTime="2026-04-16 14:15:53.281809255 +0000 UTC m=+993.475625189" Apr 16 14:15:54.260917 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:54.260881 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:15:56.270725 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:56.270691 2569 generic.go:358] "Generic (PLEG): container finished" podID="5746ab49-9a09-460b-9049-ee873195b2b2" containerID="93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398" exitCode=0 Apr 16 14:15:56.271081 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:56.270769 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" event={"ID":"5746ab49-9a09-460b-9049-ee873195b2b2","Type":"ContainerDied","Data":"93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398"} Apr 16 14:15:58.279965 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:58.279932 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" event={"ID":"5746ab49-9a09-460b-9049-ee873195b2b2","Type":"ContainerStarted","Data":"090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c"} Apr 16 14:15:58.298298 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:15:58.298227 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" podStartSLOduration=6.626519315 podStartE2EDuration="8.298212066s" podCreationTimestamp="2026-04-16 14:15:50 +0000 UTC" firstStartedPulling="2026-04-16 14:15:56.272023395 +0000 UTC m=+996.465839322" lastFinishedPulling="2026-04-16 14:15:57.943716156 +0000 UTC m=+998.137532073" observedRunningTime="2026-04-16 14:15:58.29742699 +0000 UTC m=+998.491242929" watchObservedRunningTime="2026-04-16 14:15:58.298212066 +0000 UTC m=+998.492028002" Apr 16 14:16:01.226098 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:01.226062 2569 
kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:16:01.226602 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:01.226108 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:16:01.239083 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:01.239059 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:16:01.254356 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:01.254330 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:16:01.254500 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:01.254368 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:16:01.257184 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:01.257159 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:16:01.291327 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:01.291299 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:16:01.301449 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:01.301429 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:16:19.637923 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.637890 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594"] Apr 16 14:16:19.641797 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.641770 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.644588 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.644563 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 16 14:16:19.644743 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.644629 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-bgqr4\"" Apr 16 14:16:19.652496 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.652469 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594"] Apr 16 14:16:19.738545 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.738517 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d017164c-1463-401e-81bb-702a2d20ca62-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.738545 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.738553 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.738746 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.738575 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sn6mf\" (UniqueName: \"kubernetes.io/projected/d017164c-1463-401e-81bb-702a2d20ca62-kube-api-access-sn6mf\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.738746 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.738675 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.738746 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.738707 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.738746 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.738730 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.839880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.839847 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.839880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.839886 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.840088 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.839911 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.840088 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.839966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d017164c-1463-401e-81bb-702a2d20ca62-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.840088 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.840000 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.840244 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.840120 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sn6mf\" (UniqueName: \"kubernetes.io/projected/d017164c-1463-401e-81bb-702a2d20ca62-kube-api-access-sn6mf\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.840328 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.840311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.840410 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.840386 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.840543 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.840434 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.840633 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.840616 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.842711 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.842688 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d017164c-1463-401e-81bb-702a2d20ca62-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.848843 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.848812 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sn6mf\" (UniqueName: \"kubernetes.io/projected/d017164c-1463-401e-81bb-702a2d20ca62-kube-api-access-sn6mf\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:19.952185 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:19.952110 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:20.083065 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:20.083031 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594"] Apr 16 14:16:20.086988 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:16:20.086961 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd017164c_1463_401e_81bb_702a2d20ca62.slice/crio-d486ab482bf25c85f38b1babafa71044b9a9936e7510d1a84a314f2f57903050 WatchSource:0}: Error finding container d486ab482bf25c85f38b1babafa71044b9a9936e7510d1a84a314f2f57903050: Status 404 returned error can't find the container with id d486ab482bf25c85f38b1babafa71044b9a9936e7510d1a84a314f2f57903050 Apr 16 14:16:20.355169 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:20.355137 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" event={"ID":"d017164c-1463-401e-81bb-702a2d20ca62","Type":"ContainerStarted","Data":"22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2"} Apr 16 14:16:20.355169 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:20.355172 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" event={"ID":"d017164c-1463-401e-81bb-702a2d20ca62","Type":"ContainerStarted","Data":"d486ab482bf25c85f38b1babafa71044b9a9936e7510d1a84a314f2f57903050"} Apr 16 14:16:21.359641 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:21.359602 2569 generic.go:358] "Generic (PLEG): container finished" podID="d017164c-1463-401e-81bb-702a2d20ca62" containerID="22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2" exitCode=0 Apr 16 14:16:21.360080 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:21.359669 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" event={"ID":"d017164c-1463-401e-81bb-702a2d20ca62","Type":"ContainerDied","Data":"22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2"} Apr 16 14:16:22.294826 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:22.294798 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:16:22.365696 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:22.365663 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" event={"ID":"d017164c-1463-401e-81bb-702a2d20ca62","Type":"ContainerStarted","Data":"6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771"} Apr 16 14:16:22.365696 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:22.365702 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" event={"ID":"d017164c-1463-401e-81bb-702a2d20ca62","Type":"ContainerStarted","Data":"ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df"} Apr 16 14:16:22.366186 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:22.365912 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" 
Apr 16 14:16:22.388346 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:22.388301 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" podStartSLOduration=3.388284752 podStartE2EDuration="3.388284752s" podCreationTimestamp="2026-04-16 14:16:19 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:16:22.386373976 +0000 UTC m=+1022.580189937" watchObservedRunningTime="2026-04-16 14:16:22.388284752 +0000 UTC m=+1022.582100681" Apr 16 14:16:29.952242 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:29.952200 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:29.952795 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:29.952325 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:29.955081 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:29.955060 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:30.392642 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:30.392611 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:16:38.472363 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.472329 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj"] Apr 16 14:16:38.473255 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.473200 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="main" containerID="cri-o://5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f" gracePeriod=30 Apr 16 14:16:38.474055 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.473729 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="tokenizer" containerID="cri-o://df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c" gracePeriod=30 Apr 16 14:16:38.484187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.484095 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88"] Apr 16 14:16:38.484640 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.484611 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" podUID="5746ab49-9a09-460b-9049-ee873195b2b2" containerName="main" containerID="cri-o://090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c" gracePeriod=30 Apr 16 14:16:38.725693 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.725635 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:16:38.799222 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.799190 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-dshm\") pod \"5746ab49-9a09-460b-9049-ee873195b2b2\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " Apr 16 14:16:38.799397 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.799254 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-kserve-provision-location\") pod \"5746ab49-9a09-460b-9049-ee873195b2b2\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " Apr 16 14:16:38.799397 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.799316 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lttwn\" (UniqueName: \"kubernetes.io/projected/5746ab49-9a09-460b-9049-ee873195b2b2-kube-api-access-lttwn\") pod \"5746ab49-9a09-460b-9049-ee873195b2b2\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " Apr 16 14:16:38.799397 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.799387 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5746ab49-9a09-460b-9049-ee873195b2b2-tls-certs\") pod \"5746ab49-9a09-460b-9049-ee873195b2b2\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " Apr 16 14:16:38.799562 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.799417 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-model-cache\") pod \"5746ab49-9a09-460b-9049-ee873195b2b2\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " Apr 16 14:16:38.799562 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.799452 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-home\") pod \"5746ab49-9a09-460b-9049-ee873195b2b2\" (UID: \"5746ab49-9a09-460b-9049-ee873195b2b2\") " Apr 16 14:16:38.800331 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.800263 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-model-cache" (OuterVolumeSpecName: "model-cache") pod "5746ab49-9a09-460b-9049-ee873195b2b2" (UID: "5746ab49-9a09-460b-9049-ee873195b2b2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:38.800699 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.800679 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-home" (OuterVolumeSpecName: "home") pod "5746ab49-9a09-460b-9049-ee873195b2b2" (UID: "5746ab49-9a09-460b-9049-ee873195b2b2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:38.806461 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.806422 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5746ab49-9a09-460b-9049-ee873195b2b2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5746ab49-9a09-460b-9049-ee873195b2b2" (UID: "5746ab49-9a09-460b-9049-ee873195b2b2"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:16:38.808025 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.807749 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5746ab49-9a09-460b-9049-ee873195b2b2-kube-api-access-lttwn" (OuterVolumeSpecName: "kube-api-access-lttwn") pod "5746ab49-9a09-460b-9049-ee873195b2b2" (UID: "5746ab49-9a09-460b-9049-ee873195b2b2"). InnerVolumeSpecName "kube-api-access-lttwn". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:16:38.810196 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.810168 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-dshm" (OuterVolumeSpecName: "dshm") pod "5746ab49-9a09-460b-9049-ee873195b2b2" (UID: "5746ab49-9a09-460b-9049-ee873195b2b2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:38.861518 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.861470 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5746ab49-9a09-460b-9049-ee873195b2b2" (UID: "5746ab49-9a09-460b-9049-ee873195b2b2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:38.901124 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.901094 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:38.901124 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.901123 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lttwn\" (UniqueName: \"kubernetes.io/projected/5746ab49-9a09-460b-9049-ee873195b2b2-kube-api-access-lttwn\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:38.901356 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.901135 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5746ab49-9a09-460b-9049-ee873195b2b2-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:38.901356 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.901147 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-model-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:38.901356 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.901156 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-home\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:38.901356 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:38.901163 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/5746ab49-9a09-460b-9049-ee873195b2b2-dshm\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:39.421428 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.421397 2569 generic.go:358] "Generic (PLEG): container finished" podID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" 
containerID="5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f" exitCode=0 Apr 16 14:16:39.421582 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.421474 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" event={"ID":"1f730d0f-c09d-4052-8d36-1b25b5e834cc","Type":"ContainerDied","Data":"5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f"} Apr 16 14:16:39.422780 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.422757 2569 generic.go:358] "Generic (PLEG): container finished" podID="5746ab49-9a09-460b-9049-ee873195b2b2" containerID="090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c" exitCode=0 Apr 16 14:16:39.422900 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.422788 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" event={"ID":"5746ab49-9a09-460b-9049-ee873195b2b2","Type":"ContainerDied","Data":"090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c"} Apr 16 14:16:39.422900 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.422820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" event={"ID":"5746ab49-9a09-460b-9049-ee873195b2b2","Type":"ContainerDied","Data":"00cd977c9a7df976dde18778493048decebe751eb2291a38e22a3b37e4f35b11"} Apr 16 14:16:39.422900 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.422833 2569 scope.go:117] "RemoveContainer" containerID="090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c" Apr 16 14:16:39.422900 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.422878 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88" Apr 16 14:16:39.431084 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.431068 2569 scope.go:117] "RemoveContainer" containerID="93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398" Apr 16 14:16:39.440692 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.440676 2569 scope.go:117] "RemoveContainer" containerID="090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c" Apr 16 14:16:39.440910 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:16:39.440893 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c\": container with ID starting with 090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c not found: ID does not exist" containerID="090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c" Apr 16 14:16:39.440974 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.440921 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c"} err="failed to get container status \"090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c\": rpc error: code = NotFound desc = could not find container \"090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c\": container with ID starting with 090514aac00f630dffd98e64084e786cdaf6932cbc2737708c4b34be807ea36c not found: ID does not exist" Apr 16 14:16:39.440974 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.440944 2569 scope.go:117] "RemoveContainer" containerID="93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398" Apr 16 14:16:39.441188 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:16:39.441163 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398\": container with ID starting with 93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398 not found: ID does not exist" containerID="93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398" Apr 16 14:16:39.441231 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.441198 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398"} err="failed to get container status \"93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398\": rpc error: code = NotFound desc = could not find container \"93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398\": container with ID starting with 93947c9d13787be95172a590109fda123a4fb8558f9136efee5027465dab9398 not found: ID does not exist" Apr 16 14:16:39.444952 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.444933 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88"] Apr 16 14:16:39.448954 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.448934 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-68b76b6d9b-xjl88"] Apr 16 14:16:39.729611 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.729580 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:16:39.809179 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809153 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26b64\" (UniqueName: \"kubernetes.io/projected/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kube-api-access-26b64\") pod \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " Apr 16 14:16:39.809179 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809181 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tls-certs\") pod \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " Apr 16 14:16:39.809406 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809205 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-uds\") pod \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " Apr 16 14:16:39.809406 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809242 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kserve-provision-location\") pod \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " Apr 16 14:16:39.809406 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809312 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-cache\") pod \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " Apr 16 14:16:39.809406 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809342 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-tmp\") pod \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\" (UID: \"1f730d0f-c09d-4052-8d36-1b25b5e834cc\") " Apr 16 14:16:39.809613 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809443 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "1f730d0f-c09d-4052-8d36-1b25b5e834cc" (UID: "1f730d0f-c09d-4052-8d36-1b25b5e834cc"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:39.809613 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809521 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-uds\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:39.809706 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809638 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "1f730d0f-c09d-4052-8d36-1b25b5e834cc" (UID: "1f730d0f-c09d-4052-8d36-1b25b5e834cc"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:39.809752 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809714 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "1f730d0f-c09d-4052-8d36-1b25b5e834cc" (UID: "1f730d0f-c09d-4052-8d36-1b25b5e834cc"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:39.810009 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.809983 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "1f730d0f-c09d-4052-8d36-1b25b5e834cc" (UID: "1f730d0f-c09d-4052-8d36-1b25b5e834cc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:16:39.811467 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.811440 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "1f730d0f-c09d-4052-8d36-1b25b5e834cc" (UID: "1f730d0f-c09d-4052-8d36-1b25b5e834cc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:16:39.811467 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.811449 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kube-api-access-26b64" (OuterVolumeSpecName: "kube-api-access-26b64") pod "1f730d0f-c09d-4052-8d36-1b25b5e834cc" (UID: "1f730d0f-c09d-4052-8d36-1b25b5e834cc"). InnerVolumeSpecName "kube-api-access-26b64". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:16:39.910872 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.910834 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:39.910872 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.910867 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:39.910872 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.910877 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tokenizer-tmp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:39.911098 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.910886 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26b64\" (UniqueName: \"kubernetes.io/projected/1f730d0f-c09d-4052-8d36-1b25b5e834cc-kube-api-access-26b64\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:39.911098 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:39.910896 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/1f730d0f-c09d-4052-8d36-1b25b5e834cc-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:16:40.348444 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.348410 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5746ab49-9a09-460b-9049-ee873195b2b2" path="/var/lib/kubelet/pods/5746ab49-9a09-460b-9049-ee873195b2b2/volumes" Apr 16 14:16:40.428191 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.428155 2569 generic.go:358] "Generic (PLEG): container finished" podID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerID="df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c" exitCode=0 Apr 16 14:16:40.428360 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.428222 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" Apr 16 14:16:40.428360 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.428255 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" event={"ID":"1f730d0f-c09d-4052-8d36-1b25b5e834cc","Type":"ContainerDied","Data":"df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c"} Apr 16 14:16:40.428360 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.428309 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj" event={"ID":"1f730d0f-c09d-4052-8d36-1b25b5e834cc","Type":"ContainerDied","Data":"669477caab63c9624570481211b3e7d53987cb9c3b0545f538ce9fe7d7404a2e"} Apr 16 14:16:40.428360 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.428332 2569 scope.go:117] "RemoveContainer" containerID="df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c" Apr 16 14:16:40.435823 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.435805 2569 scope.go:117] "RemoveContainer" containerID="5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f" Apr 16 14:16:40.442762 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.442747 2569 scope.go:117] "RemoveContainer" containerID="7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4" Apr 16 14:16:40.449483 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.449457 2569 scope.go:117] "RemoveContainer" containerID="df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c" Apr 16 14:16:40.449713 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:16:40.449695 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c\": container with ID starting with df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c not found: ID does not exist" containerID="df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c" Apr 16 14:16:40.449764 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.449720 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c"} err="failed to get container status \"df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c\": rpc error: code = NotFound desc = could not find container \"df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c\": container with ID starting with df2d1a9c06a88f5cd804002fa10d425200b770f1ae850550f31027c960aa8b8c not found: ID does not exist" Apr 16 14:16:40.449764 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.449737 2569 scope.go:117] "RemoveContainer" containerID="5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f" Apr 16 14:16:40.449977 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:16:40.449958 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f\": container with ID starting with 5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f not found: ID does not exist" containerID="5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f" Apr 16 14:16:40.450020 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.449983 2569 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f"} err="failed to get container status \"5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f\": rpc error: code = NotFound desc = could not find container \"5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f\": container with ID starting with 5cc89b2db8e99190f4cc2d33a562690724763662552cc4ffe5d7ff672068751f not found: ID does not exist" Apr 16 14:16:40.450020 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.450000 2569 scope.go:117] "RemoveContainer" containerID="7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4" Apr 16 14:16:40.450239 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:16:40.450220 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4\": container with ID starting with 7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4 not found: ID does not exist" containerID="7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4" Apr 16 14:16:40.450306 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.450244 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4"} err="failed to get container status \"7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4\": rpc error: code = NotFound desc = could not find container \"7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4\": container with ID starting with 7067c5b5efe0e49a9c37b3ed037e6d312f5af96e3daec5ebd4ef6009a3628ff4 not found: ID does not exist" Apr 16 14:16:40.468418 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.468398 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj"] Apr 16 14:16:40.476695 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:40.476675 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-788fcf8t25kj"] Apr 16 14:16:42.348119 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:42.348087 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" path="/var/lib/kubelet/pods/1f730d0f-c09d-4052-8d36-1b25b5e834cc/volumes" Apr 16 14:16:52.399194 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:16:52.399167 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:18:45.544320 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:45.544214 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594"] Apr 16 14:18:45.544793 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:45.544580 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="main" containerID="cri-o://ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df" gracePeriod=30 Apr 16 14:18:45.544793 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:45.544635 2569 kuberuntime_container.go:864] "Killing container with a grace 
period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="tokenizer" containerID="cri-o://6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771" gracePeriod=30 Apr 16 14:18:45.832379 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:45.832296 2569 generic.go:358] "Generic (PLEG): container finished" podID="d017164c-1463-401e-81bb-702a2d20ca62" containerID="ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df" exitCode=0 Apr 16 14:18:45.832515 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:45.832308 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" event={"ID":"d017164c-1463-401e-81bb-702a2d20ca62","Type":"ContainerDied","Data":"ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df"} Apr 16 14:18:46.699414 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.699395 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:18:46.723027 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723003 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-cache\") pod \"d017164c-1463-401e-81bb-702a2d20ca62\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " Apr 16 14:18:46.723178 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723040 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d017164c-1463-401e-81bb-702a2d20ca62-tls-certs\") pod \"d017164c-1463-401e-81bb-702a2d20ca62\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " Apr 16 14:18:46.723178 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723066 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-tmp\") pod \"d017164c-1463-401e-81bb-702a2d20ca62\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " Apr 16 14:18:46.723355 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723328 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "d017164c-1463-401e-81bb-702a2d20ca62" (UID: "d017164c-1463-401e-81bb-702a2d20ca62"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:18:46.723434 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723386 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-uds\") pod \"d017164c-1463-401e-81bb-702a2d20ca62\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " Apr 16 14:18:46.723434 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723421 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sn6mf\" (UniqueName: \"kubernetes.io/projected/d017164c-1463-401e-81bb-702a2d20ca62-kube-api-access-sn6mf\") pod \"d017164c-1463-401e-81bb-702a2d20ca62\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " Apr 16 14:18:46.723434 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723418 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "d017164c-1463-401e-81bb-702a2d20ca62" (UID: "d017164c-1463-401e-81bb-702a2d20ca62"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:18:46.723588 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723472 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-kserve-provision-location\") pod \"d017164c-1463-401e-81bb-702a2d20ca62\" (UID: \"d017164c-1463-401e-81bb-702a2d20ca62\") " Apr 16 14:18:46.723696 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723643 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "d017164c-1463-401e-81bb-702a2d20ca62" (UID: "d017164c-1463-401e-81bb-702a2d20ca62"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:18:46.723804 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723786 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:18:46.723880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723806 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-tmp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:18:46.723880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.723820 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-tokenizer-uds\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:18:46.724309 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.724281 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d017164c-1463-401e-81bb-702a2d20ca62" (UID: "d017164c-1463-401e-81bb-702a2d20ca62"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:18:46.725640 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.725608 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d017164c-1463-401e-81bb-702a2d20ca62-kube-api-access-sn6mf" (OuterVolumeSpecName: "kube-api-access-sn6mf") pod "d017164c-1463-401e-81bb-702a2d20ca62" (UID: "d017164c-1463-401e-81bb-702a2d20ca62"). InnerVolumeSpecName "kube-api-access-sn6mf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:18:46.725957 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.725934 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d017164c-1463-401e-81bb-702a2d20ca62-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d017164c-1463-401e-81bb-702a2d20ca62" (UID: "d017164c-1463-401e-81bb-702a2d20ca62"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:18:46.824141 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.824083 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d017164c-1463-401e-81bb-702a2d20ca62-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:18:46.824141 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.824103 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sn6mf\" (UniqueName: \"kubernetes.io/projected/d017164c-1463-401e-81bb-702a2d20ca62-kube-api-access-sn6mf\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:18:46.824141 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.824114 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d017164c-1463-401e-81bb-702a2d20ca62-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:18:46.837711 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.837683 2569 generic.go:358] "Generic (PLEG): container finished" podID="d017164c-1463-401e-81bb-702a2d20ca62" containerID="6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771" exitCode=0 Apr 16 14:18:46.837828 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.837762 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" Apr 16 14:18:46.837828 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.837764 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" event={"ID":"d017164c-1463-401e-81bb-702a2d20ca62","Type":"ContainerDied","Data":"6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771"} Apr 16 14:18:46.837828 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.837804 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594" event={"ID":"d017164c-1463-401e-81bb-702a2d20ca62","Type":"ContainerDied","Data":"d486ab482bf25c85f38b1babafa71044b9a9936e7510d1a84a314f2f57903050"} Apr 16 14:18:46.837828 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.837823 2569 scope.go:117] "RemoveContainer" containerID="6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771" Apr 16 14:18:46.846077 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.846062 2569 scope.go:117] "RemoveContainer" containerID="ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df" Apr 16 14:18:46.853121 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.853106 2569 scope.go:117] "RemoveContainer" containerID="22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2" Apr 16 14:18:46.859309 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.859284 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594"] Apr 16 14:18:46.860569 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.860441 2569 scope.go:117] "RemoveContainer" containerID="6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771" Apr 16 14:18:46.860961 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:18:46.860936 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771\": container with ID starting with 6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771 not found: ID does not exist" containerID="6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771" Apr 16 14:18:46.861043 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.860970 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771"} err="failed to get container status \"6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771\": rpc error: code = NotFound desc = could not find container \"6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771\": container with ID starting with 6ef5e658c238b766ebb3c2b63c7cac70b7874de17cb46f38c7699f56479dd771 not found: ID does not exist" Apr 16 14:18:46.861043 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.861010 2569 scope.go:117] "RemoveContainer" containerID="ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df" Apr 16 14:18:46.861319 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:18:46.861296 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df\": container with ID starting with ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df 
not found: ID does not exist" containerID="ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df" Apr 16 14:18:46.861385 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.861329 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df"} err="failed to get container status \"ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df\": rpc error: code = NotFound desc = could not find container \"ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df\": container with ID starting with ffd017623abb252543041a46c635c62cdf2f5d068415954220942f3caa3940df not found: ID does not exist" Apr 16 14:18:46.861385 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.861354 2569 scope.go:117] "RemoveContainer" containerID="22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2" Apr 16 14:18:46.861627 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:18:46.861607 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2\": container with ID starting with 22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2 not found: ID does not exist" containerID="22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2" Apr 16 14:18:46.861688 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.861635 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2"} err="failed to get container status \"22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2\": rpc error: code = NotFound desc = could not find container \"22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2\": container with ID starting with 22f40cfa182c053be5c176dfa0340ca62b78a823549a4a63ab2e78f614835cb2 not found: ID does not exist" Apr 16 14:18:46.862237 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:46.862221 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-sche8g594"] Apr 16 14:18:48.347569 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:48.347539 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d017164c-1463-401e-81bb-702a2d20ca62" path="/var/lib/kubelet/pods/d017164c-1463-401e-81bb-702a2d20ca62/volumes" Apr 16 14:18:55.363632 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.363599 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49"] Apr 16 14:18:55.364026 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364011 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="storage-initializer" Apr 16 14:18:55.364074 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364029 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="storage-initializer" Apr 16 14:18:55.364074 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364044 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="tokenizer" Apr 16 14:18:55.364074 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364053 2569 state_mem.go:107] "Deleted CPUSet 
assignment" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="tokenizer" Apr 16 14:18:55.364074 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364065 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="storage-initializer" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364074 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="storage-initializer" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364087 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5746ab49-9a09-460b-9049-ee873195b2b2" containerName="main" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364096 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5746ab49-9a09-460b-9049-ee873195b2b2" containerName="main" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364112 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="tokenizer" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364120 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="tokenizer" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364129 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="main" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364137 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="main" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364150 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5746ab49-9a09-460b-9049-ee873195b2b2" containerName="storage-initializer" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364158 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="5746ab49-9a09-460b-9049-ee873195b2b2" containerName="storage-initializer" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364175 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="main" Apr 16 14:18:55.364200 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364184 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="main" Apr 16 14:18:55.364529 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364256 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="main" Apr 16 14:18:55.364529 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364283 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="1f730d0f-c09d-4052-8d36-1b25b5e834cc" containerName="tokenizer" Apr 16 14:18:55.364529 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364295 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="5746ab49-9a09-460b-9049-ee873195b2b2" containerName="main" Apr 16 14:18:55.364529 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364305 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d017164c-1463-401e-81bb-702a2d20ca62" 
containerName="tokenizer" Apr 16 14:18:55.364529 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.364315 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="d017164c-1463-401e-81bb-702a2d20ca62" containerName="main" Apr 16 14:18:55.367739 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.367717 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.371746 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.371724 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 16 14:18:55.371871 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.371734 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:18:55.371871 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.371734 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:18:55.371871 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.371776 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:18:55.372037 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.371795 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-7gcxz\"" Apr 16 14:18:55.378122 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.378094 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49"] Apr 16 14:18:55.390450 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.390419 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.390572 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.390462 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.390572 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.390542 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bld6n\" (UniqueName: \"kubernetes.io/projected/17b6a48c-64ec-4d92-9c69-e876fc4df685-kube-api-access-bld6n\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.390698 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.390603 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.390698 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.390630 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/17b6a48c-64ec-4d92-9c69-e876fc4df685-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.390796 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.390705 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.491844 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.491812 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bld6n\" (UniqueName: \"kubernetes.io/projected/17b6a48c-64ec-4d92-9c69-e876fc4df685-kube-api-access-bld6n\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.491844 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.491850 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.492046 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.491875 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/17b6a48c-64ec-4d92-9c69-e876fc4df685-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.492046 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.491907 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.492046 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.491966 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-uds\") 
pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.492198 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.492107 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.492263 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.492242 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.492350 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.492329 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.492387 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.492361 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.492475 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.492458 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.494559 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.494538 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/17b6a48c-64ec-4d92-9c69-e876fc4df685-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.513067 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.513038 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bld6n\" (UniqueName: \"kubernetes.io/projected/17b6a48c-64ec-4d92-9c69-e876fc4df685-kube-api-access-bld6n\") pod \"custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.677064 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.676981 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:55.805592 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.803080 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49"] Apr 16 14:18:55.809380 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.809355 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:18:55.870570 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.870536 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" event={"ID":"17b6a48c-64ec-4d92-9c69-e876fc4df685","Type":"ContainerStarted","Data":"96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f"} Apr 16 14:18:55.870695 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:55.870575 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" event={"ID":"17b6a48c-64ec-4d92-9c69-e876fc4df685","Type":"ContainerStarted","Data":"7ff2a570e6bcf9a3153b929208f880ef8ba784dfb1cffe8f61cf5875ef216634"} Apr 16 14:18:56.875862 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:56.875780 2569 generic.go:358] "Generic (PLEG): container finished" podID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerID="96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f" exitCode=0 Apr 16 14:18:56.876222 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:56.875847 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" event={"ID":"17b6a48c-64ec-4d92-9c69-e876fc4df685","Type":"ContainerDied","Data":"96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f"} Apr 16 14:18:57.881328 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:57.881290 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" event={"ID":"17b6a48c-64ec-4d92-9c69-e876fc4df685","Type":"ContainerStarted","Data":"0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265"} Apr 16 14:18:57.881328 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:57.881331 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" event={"ID":"17b6a48c-64ec-4d92-9c69-e876fc4df685","Type":"ContainerStarted","Data":"80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2"} Apr 16 14:18:57.881766 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:57.881453 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:18:57.904484 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:18:57.904442 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" podStartSLOduration=2.904428358 podStartE2EDuration="2.904428358s" podCreationTimestamp="2026-04-16 14:18:55 +0000 UTC" firstStartedPulling="0001-01-01 
00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:18:57.902644877 +0000 UTC m=+1178.096460825" watchObservedRunningTime="2026-04-16 14:18:57.904428358 +0000 UTC m=+1178.098244586" Apr 16 14:19:05.677873 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:19:05.677836 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:19:05.677873 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:19:05.677880 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:19:05.680486 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:19:05.680463 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:19:05.910934 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:19:05.910909 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:19:26.916136 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:19:26.916104 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:20:41.927431 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:41.927401 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49"] Apr 16 14:20:41.929870 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:41.927708 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="main" containerID="cri-o://80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2" gracePeriod=30 Apr 16 14:20:41.929870 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:41.927754 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="tokenizer" containerID="cri-o://0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265" gracePeriod=30 Apr 16 14:20:42.212547 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:42.212465 2569 generic.go:358] "Generic (PLEG): container finished" podID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerID="80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2" exitCode=0 Apr 16 14:20:42.212687 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:42.212538 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" event={"ID":"17b6a48c-64ec-4d92-9c69-e876fc4df685","Type":"ContainerDied","Data":"80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2"} Apr 16 14:20:43.095838 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.095818 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:20:43.217562 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.217483 2569 generic.go:358] "Generic (PLEG): container finished" podID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerID="0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265" exitCode=0 Apr 16 14:20:43.217562 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.217553 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" event={"ID":"17b6a48c-64ec-4d92-9c69-e876fc4df685","Type":"ContainerDied","Data":"0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265"} Apr 16 14:20:43.217733 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.217564 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" Apr 16 14:20:43.217733 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.217587 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49" event={"ID":"17b6a48c-64ec-4d92-9c69-e876fc4df685","Type":"ContainerDied","Data":"7ff2a570e6bcf9a3153b929208f880ef8ba784dfb1cffe8f61cf5875ef216634"} Apr 16 14:20:43.217733 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.217609 2569 scope.go:117] "RemoveContainer" containerID="0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265" Apr 16 14:20:43.225234 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.225206 2569 scope.go:117] "RemoveContainer" containerID="80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2" Apr 16 14:20:43.232657 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.232637 2569 scope.go:117] "RemoveContainer" containerID="96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f" Apr 16 14:20:43.239484 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.239464 2569 scope.go:117] "RemoveContainer" containerID="0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265" Apr 16 14:20:43.239734 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:20:43.239714 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265\": container with ID starting with 0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265 not found: ID does not exist" containerID="0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265" Apr 16 14:20:43.239800 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.239748 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265"} err="failed to get container status \"0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265\": rpc error: code = NotFound desc = could not find container \"0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265\": container with ID starting with 0b95dc3f2463111953a2f529e500f64fcd95db1dbe66884f368ae0037af3a265 not found: ID does not exist" Apr 16 14:20:43.239800 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.239774 2569 scope.go:117] "RemoveContainer" containerID="80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2" Apr 16 14:20:43.242084 ip-10-0-138-227 kubenswrapper[2569]: E0416 
14:20:43.240322 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2\": container with ID starting with 80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2 not found: ID does not exist" containerID="80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2" Apr 16 14:20:43.242084 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.240353 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2"} err="failed to get container status \"80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2\": rpc error: code = NotFound desc = could not find container \"80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2\": container with ID starting with 80bacefb9ef576fcd21454b0b79936bfef9f7349d181b14a701da95c03d9f5b2 not found: ID does not exist" Apr 16 14:20:43.242084 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.240374 2569 scope.go:117] "RemoveContainer" containerID="96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f" Apr 16 14:20:43.242084 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:20:43.240628 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f\": container with ID starting with 96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f not found: ID does not exist" containerID="96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f" Apr 16 14:20:43.242084 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.240656 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f"} err="failed to get container status \"96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f\": rpc error: code = NotFound desc = could not find container \"96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f\": container with ID starting with 96365462b98805ec3e5d480fe9e68c63a19b776301b20cd8fe76b9099dc6816f not found: ID does not exist" Apr 16 14:20:43.246470 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246453 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/17b6a48c-64ec-4d92-9c69-e876fc4df685-tls-certs\") pod \"17b6a48c-64ec-4d92-9c69-e876fc4df685\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " Apr 16 14:20:43.246525 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246497 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-kserve-provision-location\") pod \"17b6a48c-64ec-4d92-9c69-e876fc4df685\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " Apr 16 14:20:43.246525 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246520 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-uds\") pod \"17b6a48c-64ec-4d92-9c69-e876fc4df685\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " Apr 16 14:20:43.246595 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246546 2569 
reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-tmp\") pod \"17b6a48c-64ec-4d92-9c69-e876fc4df685\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " Apr 16 14:20:43.246595 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246565 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-bld6n\" (UniqueName: \"kubernetes.io/projected/17b6a48c-64ec-4d92-9c69-e876fc4df685-kube-api-access-bld6n\") pod \"17b6a48c-64ec-4d92-9c69-e876fc4df685\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " Apr 16 14:20:43.246688 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246603 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-cache\") pod \"17b6a48c-64ec-4d92-9c69-e876fc4df685\" (UID: \"17b6a48c-64ec-4d92-9c69-e876fc4df685\") " Apr 16 14:20:43.246837 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246811 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "17b6a48c-64ec-4d92-9c69-e876fc4df685" (UID: "17b6a48c-64ec-4d92-9c69-e876fc4df685"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:43.246941 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246919 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "17b6a48c-64ec-4d92-9c69-e876fc4df685" (UID: "17b6a48c-64ec-4d92-9c69-e876fc4df685"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:43.246996 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.246933 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "17b6a48c-64ec-4d92-9c69-e876fc4df685" (UID: "17b6a48c-64ec-4d92-9c69-e876fc4df685"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:43.247323 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.247304 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "17b6a48c-64ec-4d92-9c69-e876fc4df685" (UID: "17b6a48c-64ec-4d92-9c69-e876fc4df685"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:20:43.248585 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.248557 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/17b6a48c-64ec-4d92-9c69-e876fc4df685-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "17b6a48c-64ec-4d92-9c69-e876fc4df685" (UID: "17b6a48c-64ec-4d92-9c69-e876fc4df685"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:20:43.248668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.248593 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/17b6a48c-64ec-4d92-9c69-e876fc4df685-kube-api-access-bld6n" (OuterVolumeSpecName: "kube-api-access-bld6n") pod "17b6a48c-64ec-4d92-9c69-e876fc4df685" (UID: "17b6a48c-64ec-4d92-9c69-e876fc4df685"). InnerVolumeSpecName "kube-api-access-bld6n". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:20:43.347792 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.347754 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/17b6a48c-64ec-4d92-9c69-e876fc4df685-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:20:43.347792 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.347790 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:20:43.347792 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.347799 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-uds\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:20:43.347792 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.347809 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-tmp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:20:43.348018 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.347818 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-bld6n\" (UniqueName: \"kubernetes.io/projected/17b6a48c-64ec-4d92-9c69-e876fc4df685-kube-api-access-bld6n\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:20:43.348018 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.347826 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/17b6a48c-64ec-4d92-9c69-e876fc4df685-tokenizer-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:20:43.540563 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.540539 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49"] Apr 16 14:20:43.544507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:43.544482 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-7cf75894jss49"] Apr 16 14:20:44.351655 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:44.351625 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" path="/var/lib/kubelet/pods/17b6a48c-64ec-4d92-9c69-e876fc4df685/volumes" Apr 16 14:20:47.756125 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756086 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs"] Apr 16 14:20:47.756547 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756531 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="main" Apr 16 14:20:47.756589 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756552 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="main" Apr 16 14:20:47.756589 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756576 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="storage-initializer" Apr 16 14:20:47.756589 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756584 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="storage-initializer" Apr 16 14:20:47.756696 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756597 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="tokenizer" Apr 16 14:20:47.756696 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756606 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="tokenizer" Apr 16 14:20:47.756696 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756670 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="main" Apr 16 14:20:47.756696 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.756682 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="17b6a48c-64ec-4d92-9c69-e876fc4df685" containerName="tokenizer" Apr 16 14:20:47.759975 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.759957 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.764505 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.764486 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:20:47.764747 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.764726 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 16 14:20:47.766035 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.766002 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:20:47.766155 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.766069 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-2cjwn\"" Apr 16 14:20:47.766234 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.766215 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:20:47.778329 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.778308 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.778425 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.778338 2569 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.778425 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.778357 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.778425 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.778382 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ef85ce-f20b-436a-b266-0d611823716d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.778530 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.778451 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.778530 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.778481 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7rlxl\" (UniqueName: \"kubernetes.io/projected/e1ef85ce-f20b-436a-b266-0d611823716d-kube-api-access-7rlxl\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.782717 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.782699 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs"] Apr 16 14:20:47.879037 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879003 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7rlxl\" (UniqueName: \"kubernetes.io/projected/e1ef85ce-f20b-436a-b266-0d611823716d-kube-api-access-7rlxl\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879204 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879066 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879204 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879087 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879204 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879104 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879204 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879129 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ef85ce-f20b-436a-b266-0d611823716d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879204 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879157 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879604 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879575 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879604 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879603 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879766 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.879674 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.879766 ip-10-0-138-227 kubenswrapper[2569]: I0416 
14:20:47.879716 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.881850 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.881830 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ef85ce-f20b-436a-b266-0d611823716d-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:47.893131 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:47.893106 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7rlxl\" (UniqueName: \"kubernetes.io/projected/e1ef85ce-f20b-436a-b266-0d611823716d-kube-api-access-7rlxl\") pod \"router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:48.069718 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:48.069628 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:48.193166 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:48.193113 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs"] Apr 16 14:20:48.196550 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:20:48.196508 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1ef85ce_f20b_436a_b266_0d611823716d.slice/crio-2d98932c6a553224f54bd0be7c0cadd26079888dd5b3249017f2c37b3b1e8a77 WatchSource:0}: Error finding container 2d98932c6a553224f54bd0be7c0cadd26079888dd5b3249017f2c37b3b1e8a77: Status 404 returned error can't find the container with id 2d98932c6a553224f54bd0be7c0cadd26079888dd5b3249017f2c37b3b1e8a77 Apr 16 14:20:48.237295 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:48.237256 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" event={"ID":"e1ef85ce-f20b-436a-b266-0d611823716d","Type":"ContainerStarted","Data":"2d98932c6a553224f54bd0be7c0cadd26079888dd5b3249017f2c37b3b1e8a77"} Apr 16 14:20:49.241868 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:49.241781 2569 generic.go:358] "Generic (PLEG): container finished" podID="e1ef85ce-f20b-436a-b266-0d611823716d" containerID="1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc" exitCode=0 Apr 16 14:20:49.241868 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:49.241846 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" event={"ID":"e1ef85ce-f20b-436a-b266-0d611823716d","Type":"ContainerDied","Data":"1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc"} Apr 16 14:20:50.252709 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:50.252666 2569 kubelet.go:2569] "SyncLoop (PLEG): event 
for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" event={"ID":"e1ef85ce-f20b-436a-b266-0d611823716d","Type":"ContainerStarted","Data":"a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80"} Apr 16 14:20:50.252709 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:50.252709 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" event={"ID":"e1ef85ce-f20b-436a-b266-0d611823716d","Type":"ContainerStarted","Data":"c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455"} Apr 16 14:20:50.253230 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:50.252807 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:50.289905 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:50.289861 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" podStartSLOduration=3.289847549 podStartE2EDuration="3.289847549s" podCreationTimestamp="2026-04-16 14:20:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:20:50.288832439 +0000 UTC m=+1290.482648364" watchObservedRunningTime="2026-04-16 14:20:50.289847549 +0000 UTC m=+1290.483663483" Apr 16 14:20:58.070547 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:58.070507 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:58.070547 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:58.070550 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:58.073302 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:58.073259 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:20:58.282519 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:20:58.282493 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:21:19.286494 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:21:19.286463 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:22:15.617015 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:15.616982 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85"] Apr 16 14:22:15.617617 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:15.617220 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" podUID="0aa370be-e90a-4ef6-b024-85ba8248f0b6" containerName="manager" containerID="cri-o://a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9" gracePeriod=30 Apr 16 14:22:15.860568 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:15.860545 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:22:15.959249 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:15.959161 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4k94x\" (UniqueName: \"kubernetes.io/projected/0aa370be-e90a-4ef6-b024-85ba8248f0b6-kube-api-access-4k94x\") pod \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " Apr 16 14:22:15.959249 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:15.959198 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert\") pod \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\" (UID: \"0aa370be-e90a-4ef6-b024-85ba8248f0b6\") " Apr 16 14:22:15.961334 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:15.961302 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert" (OuterVolumeSpecName: "cert") pod "0aa370be-e90a-4ef6-b024-85ba8248f0b6" (UID: "0aa370be-e90a-4ef6-b024-85ba8248f0b6"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:22:15.961445 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:15.961393 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0aa370be-e90a-4ef6-b024-85ba8248f0b6-kube-api-access-4k94x" (OuterVolumeSpecName: "kube-api-access-4k94x") pod "0aa370be-e90a-4ef6-b024-85ba8248f0b6" (UID: "0aa370be-e90a-4ef6-b024-85ba8248f0b6"). InnerVolumeSpecName "kube-api-access-4k94x". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:22:16.060750 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.060708 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4k94x\" (UniqueName: \"kubernetes.io/projected/0aa370be-e90a-4ef6-b024-85ba8248f0b6-kube-api-access-4k94x\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:22:16.060750 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.060741 2569 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0aa370be-e90a-4ef6-b024-85ba8248f0b6-cert\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:22:16.536508 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.536478 2569 generic.go:358] "Generic (PLEG): container finished" podID="0aa370be-e90a-4ef6-b024-85ba8248f0b6" containerID="a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9" exitCode=0 Apr 16 14:22:16.536684 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.536543 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" event={"ID":"0aa370be-e90a-4ef6-b024-85ba8248f0b6","Type":"ContainerDied","Data":"a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9"} Apr 16 14:22:16.536684 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.536552 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" Apr 16 14:22:16.536684 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.536576 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85" event={"ID":"0aa370be-e90a-4ef6-b024-85ba8248f0b6","Type":"ContainerDied","Data":"80ed9b23e3daf01f74470ea401c48297becd8d85874c2f96ffb037eb5da24ddc"} Apr 16 14:22:16.536684 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.536596 2569 scope.go:117] "RemoveContainer" containerID="a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9" Apr 16 14:22:16.544643 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.544296 2569 scope.go:117] "RemoveContainer" containerID="a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9" Apr 16 14:22:16.544643 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:22:16.544546 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9\": container with ID starting with a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9 not found: ID does not exist" containerID="a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9" Apr 16 14:22:16.544643 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.544571 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9"} err="failed to get container status \"a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9\": rpc error: code = NotFound desc = could not find container \"a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9\": container with ID starting with a0462e7dd9fde6f9ae2c99a3b58f6ec276a443b47fe66906340f17b7fd3792d9 not found: ID does not exist" Apr 16 14:22:16.577019 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.577001 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85"] Apr 16 14:22:16.581192 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:16.581174 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-7ccc8fbdb4-t7z85"] Apr 16 14:22:18.347431 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:18.347389 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0aa370be-e90a-4ef6-b024-85ba8248f0b6" path="/var/lib/kubelet/pods/0aa370be-e90a-4ef6-b024-85ba8248f0b6/volumes" Apr 16 14:22:45.265456 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:45.265419 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs"] Apr 16 14:22:45.265996 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:45.265744 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="tokenizer" containerID="cri-o://a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80" gracePeriod=30 Apr 16 14:22:45.265996 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:45.265789 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="main" 
containerID="cri-o://c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455" gracePeriod=30 Apr 16 14:22:45.630102 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:45.630008 2569 generic.go:358] "Generic (PLEG): container finished" podID="e1ef85ce-f20b-436a-b266-0d611823716d" containerID="c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455" exitCode=0 Apr 16 14:22:45.630102 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:45.630082 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" event={"ID":"e1ef85ce-f20b-436a-b266-0d611823716d","Type":"ContainerDied","Data":"c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455"} Apr 16 14:22:46.516076 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.516053 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:22:46.588873 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.588802 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-kserve-provision-location\") pod \"e1ef85ce-f20b-436a-b266-0d611823716d\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " Apr 16 14:22:46.589016 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.588898 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-uds\") pod \"e1ef85ce-f20b-436a-b266-0d611823716d\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " Apr 16 14:22:46.589016 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.588950 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-cache\") pod \"e1ef85ce-f20b-436a-b266-0d611823716d\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " Apr 16 14:22:46.589016 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.588992 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7rlxl\" (UniqueName: \"kubernetes.io/projected/e1ef85ce-f20b-436a-b266-0d611823716d-kube-api-access-7rlxl\") pod \"e1ef85ce-f20b-436a-b266-0d611823716d\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " Apr 16 14:22:46.589178 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.589049 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-tmp\") pod \"e1ef85ce-f20b-436a-b266-0d611823716d\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " Apr 16 14:22:46.589178 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.589075 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ef85ce-f20b-436a-b266-0d611823716d-tls-certs\") pod \"e1ef85ce-f20b-436a-b266-0d611823716d\" (UID: \"e1ef85ce-f20b-436a-b266-0d611823716d\") " Apr 16 14:22:46.589178 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.589162 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod 
"e1ef85ce-f20b-436a-b266-0d611823716d" (UID: "e1ef85ce-f20b-436a-b266-0d611823716d"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:46.589353 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.589291 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "e1ef85ce-f20b-436a-b266-0d611823716d" (UID: "e1ef85ce-f20b-436a-b266-0d611823716d"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:46.589414 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.589367 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-uds\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:22:46.589414 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.589386 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:22:46.589499 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.589406 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "e1ef85ce-f20b-436a-b266-0d611823716d" (UID: "e1ef85ce-f20b-436a-b266-0d611823716d"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:46.589675 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.589649 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "e1ef85ce-f20b-436a-b266-0d611823716d" (UID: "e1ef85ce-f20b-436a-b266-0d611823716d"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:22:46.591114 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.591088 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e1ef85ce-f20b-436a-b266-0d611823716d-kube-api-access-7rlxl" (OuterVolumeSpecName: "kube-api-access-7rlxl") pod "e1ef85ce-f20b-436a-b266-0d611823716d" (UID: "e1ef85ce-f20b-436a-b266-0d611823716d"). InnerVolumeSpecName "kube-api-access-7rlxl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:22:46.591225 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.591144 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e1ef85ce-f20b-436a-b266-0d611823716d-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "e1ef85ce-f20b-436a-b266-0d611823716d" (UID: "e1ef85ce-f20b-436a-b266-0d611823716d"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:22:46.635208 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.635173 2569 generic.go:358] "Generic (PLEG): container finished" podID="e1ef85ce-f20b-436a-b266-0d611823716d" containerID="a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80" exitCode=0 Apr 16 14:22:46.635388 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.635252 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" Apr 16 14:22:46.635388 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.635253 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" event={"ID":"e1ef85ce-f20b-436a-b266-0d611823716d","Type":"ContainerDied","Data":"a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80"} Apr 16 14:22:46.635388 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.635307 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs" event={"ID":"e1ef85ce-f20b-436a-b266-0d611823716d","Type":"ContainerDied","Data":"2d98932c6a553224f54bd0be7c0cadd26079888dd5b3249017f2c37b3b1e8a77"} Apr 16 14:22:46.635388 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.635323 2569 scope.go:117] "RemoveContainer" containerID="a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80" Apr 16 14:22:46.643413 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.643400 2569 scope.go:117] "RemoveContainer" containerID="c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455" Apr 16 14:22:46.650631 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.650612 2569 scope.go:117] "RemoveContainer" containerID="1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc" Apr 16 14:22:46.658496 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.658480 2569 scope.go:117] "RemoveContainer" containerID="a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80" Apr 16 14:22:46.658578 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.658519 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs"] Apr 16 14:22:46.658744 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:22:46.658724 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80\": container with ID starting with a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80 not found: ID does not exist" containerID="a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80" Apr 16 14:22:46.658792 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.658755 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80"} err="failed to get container status \"a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80\": rpc error: code = NotFound desc = could not find container \"a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80\": container with ID starting with a7c2057693e05276885138bd850d1942f7bb055c059850c15ff8dfd16f252d80 not found: ID does not exist" Apr 16 14:22:46.658792 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.658773 2569 scope.go:117] "RemoveContainer" containerID="c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455" Apr 16 14:22:46.658998 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:22:46.658980 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455\": container with ID starting with c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455 not 
found: ID does not exist" containerID="c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455" Apr 16 14:22:46.659042 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.659003 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455"} err="failed to get container status \"c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455\": rpc error: code = NotFound desc = could not find container \"c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455\": container with ID starting with c54ff47d619d3089edcbefcd3ef8a0c2d28f70dd7f7ea3a862dd976ad8eea455 not found: ID does not exist" Apr 16 14:22:46.659042 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.659019 2569 scope.go:117] "RemoveContainer" containerID="1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc" Apr 16 14:22:46.659280 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:22:46.659247 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc\": container with ID starting with 1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc not found: ID does not exist" containerID="1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc" Apr 16 14:22:46.659337 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.659299 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc"} err="failed to get container status \"1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc\": rpc error: code = NotFound desc = could not find container \"1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc\": container with ID starting with 1e243c1c0aa91b793cab26ca654ec902d1efbb5f007161c4f84fd2f9fb8600bc not found: ID does not exist" Apr 16 14:22:46.664696 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.664674 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5fd5fbcd55-7lvjs"] Apr 16 14:22:46.690088 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.690064 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7rlxl\" (UniqueName: \"kubernetes.io/projected/e1ef85ce-f20b-436a-b266-0d611823716d-kube-api-access-7rlxl\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:22:46.690088 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.690088 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-tokenizer-tmp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:22:46.690245 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.690099 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/e1ef85ce-f20b-436a-b266-0d611823716d-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:22:46.690245 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:22:46.690108 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/e1ef85ce-f20b-436a-b266-0d611823716d-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:22:48.348151 ip-10-0-138-227 
kubenswrapper[2569]: I0416 14:22:48.348108 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" path="/var/lib/kubelet/pods/e1ef85ce-f20b-436a-b266-0d611823716d/volumes" Apr 16 14:23:01.462685 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.462650 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn"] Apr 16 14:23:01.463051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.462950 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0aa370be-e90a-4ef6-b024-85ba8248f0b6" containerName="manager" Apr 16 14:23:01.463051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.462960 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="0aa370be-e90a-4ef6-b024-85ba8248f0b6" containerName="manager" Apr 16 14:23:01.463051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.462971 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="storage-initializer" Apr 16 14:23:01.463051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.462977 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="storage-initializer" Apr 16 14:23:01.463051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.462993 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="main" Apr 16 14:23:01.463051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.462998 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="main" Apr 16 14:23:01.463051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.463004 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="tokenizer" Apr 16 14:23:01.463051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.463009 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="tokenizer" Apr 16 14:23:01.463394 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.463057 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="main" Apr 16 14:23:01.463394 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.463064 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="e1ef85ce-f20b-436a-b266-0d611823716d" containerName="tokenizer" Apr 16 14:23:01.463394 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.463070 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="0aa370be-e90a-4ef6-b024-85ba8248f0b6" containerName="manager" Apr 16 14:23:01.467778 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.467756 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.470790 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.470766 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:23:01.471205 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.470876 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 16 14:23:01.471870 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.471847 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-2xdc2\"" Apr 16 14:23:01.471989 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.471933 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:23:01.472050 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.471997 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:23:01.481602 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.481582 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn"] Apr 16 14:23:01.615369 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.615332 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.615558 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.615395 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df0cdabc-a082-43c5-b7a3-3a831c89082b-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.615558 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.615429 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.615558 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.615457 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.615558 ip-10-0-138-227 
kubenswrapper[2569]: I0416 14:23:01.615496 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.615558 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.615534 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vd4j5\" (UniqueName: \"kubernetes.io/projected/df0cdabc-a082-43c5-b7a3-3a831c89082b-kube-api-access-vd4j5\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716228 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716136 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df0cdabc-a082-43c5-b7a3-3a831c89082b-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716228 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716176 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716228 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716197 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716518 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716232 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716518 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716260 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vd4j5\" (UniqueName: \"kubernetes.io/projected/df0cdabc-a082-43c5-b7a3-3a831c89082b-kube-api-access-vd4j5\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716518 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716307 2569 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716674 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716608 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716674 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716639 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716748 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716675 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.716748 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.716700 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.718837 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.718818 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df0cdabc-a082-43c5-b7a3-3a831c89082b-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.727658 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.727636 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vd4j5\" (UniqueName: \"kubernetes.io/projected/df0cdabc-a082-43c5-b7a3-3a831c89082b-kube-api-access-vd4j5\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.777701 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.777667 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:01.904317 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:01.904290 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn"] Apr 16 14:23:01.906914 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:23:01.906881 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddf0cdabc_a082_43c5_b7a3_3a831c89082b.slice/crio-4b47df22e279f007563d8d9d27d4f9b31186e646c48c40953ce09a2694a6afc4 WatchSource:0}: Error finding container 4b47df22e279f007563d8d9d27d4f9b31186e646c48c40953ce09a2694a6afc4: Status 404 returned error can't find the container with id 4b47df22e279f007563d8d9d27d4f9b31186e646c48c40953ce09a2694a6afc4 Apr 16 14:23:02.686281 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:02.686246 2569 generic.go:358] "Generic (PLEG): container finished" podID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerID="cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7" exitCode=0 Apr 16 14:23:02.686625 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:02.686300 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" event={"ID":"df0cdabc-a082-43c5-b7a3-3a831c89082b","Type":"ContainerDied","Data":"cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7"} Apr 16 14:23:02.686625 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:02.686339 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" event={"ID":"df0cdabc-a082-43c5-b7a3-3a831c89082b","Type":"ContainerStarted","Data":"4b47df22e279f007563d8d9d27d4f9b31186e646c48c40953ce09a2694a6afc4"} Apr 16 14:23:03.691815 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:03.691782 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" event={"ID":"df0cdabc-a082-43c5-b7a3-3a831c89082b","Type":"ContainerStarted","Data":"0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb"} Apr 16 14:23:03.691815 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:03.691820 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" event={"ID":"df0cdabc-a082-43c5-b7a3-3a831c89082b","Type":"ContainerStarted","Data":"85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103"} Apr 16 14:23:03.692244 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:03.691943 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:03.715776 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:03.715730 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" podStartSLOduration=2.71571506 podStartE2EDuration="2.71571506s" podCreationTimestamp="2026-04-16 14:23:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:23:03.713811532 +0000 UTC m=+1423.907627478" watchObservedRunningTime="2026-04-16 14:23:03.71571506 +0000 UTC 
m=+1423.909530994" Apr 16 14:23:11.777919 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:11.777835 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:11.777919 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:11.777874 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:11.780790 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:11.780766 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:12.721732 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:12.721703 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:23:33.726976 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:23:33.726944 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:26:34.005263 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:34.005185 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn"] Apr 16 14:26:34.005854 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:34.005508 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="main" containerID="cri-o://85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103" gracePeriod=30 Apr 16 14:26:34.005854 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:34.005541 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="tokenizer" containerID="cri-o://0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb" gracePeriod=30 Apr 16 14:26:34.368739 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:34.368647 2569 generic.go:358] "Generic (PLEG): container finished" podID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerID="85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103" exitCode=0 Apr 16 14:26:34.368739 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:34.368723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" event={"ID":"df0cdabc-a082-43c5-b7a3-3a831c89082b","Type":"ContainerDied","Data":"85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103"} Apr 16 14:26:35.157337 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.157312 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:26:35.224633 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.224605 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vd4j5\" (UniqueName: \"kubernetes.io/projected/df0cdabc-a082-43c5-b7a3-3a831c89082b-kube-api-access-vd4j5\") pod \"df0cdabc-a082-43c5-b7a3-3a831c89082b\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " Apr 16 14:26:35.224633 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.224636 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df0cdabc-a082-43c5-b7a3-3a831c89082b-tls-certs\") pod \"df0cdabc-a082-43c5-b7a3-3a831c89082b\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " Apr 16 14:26:35.224925 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.224674 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-uds\") pod \"df0cdabc-a082-43c5-b7a3-3a831c89082b\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " Apr 16 14:26:35.224925 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.224717 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-kserve-provision-location\") pod \"df0cdabc-a082-43c5-b7a3-3a831c89082b\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " Apr 16 14:26:35.224925 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.224746 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-tmp\") pod \"df0cdabc-a082-43c5-b7a3-3a831c89082b\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " Apr 16 14:26:35.224925 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.224811 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-cache\") pod \"df0cdabc-a082-43c5-b7a3-3a831c89082b\" (UID: \"df0cdabc-a082-43c5-b7a3-3a831c89082b\") " Apr 16 14:26:35.225137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.224994 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "df0cdabc-a082-43c5-b7a3-3a831c89082b" (UID: "df0cdabc-a082-43c5-b7a3-3a831c89082b"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:35.225182 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.225132 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "df0cdabc-a082-43c5-b7a3-3a831c89082b" (UID: "df0cdabc-a082-43c5-b7a3-3a831c89082b"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:35.225182 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.225147 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "df0cdabc-a082-43c5-b7a3-3a831c89082b" (UID: "df0cdabc-a082-43c5-b7a3-3a831c89082b"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:35.225501 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.225480 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "df0cdabc-a082-43c5-b7a3-3a831c89082b" (UID: "df0cdabc-a082-43c5-b7a3-3a831c89082b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:26:35.226775 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.226751 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/df0cdabc-a082-43c5-b7a3-3a831c89082b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "df0cdabc-a082-43c5-b7a3-3a831c89082b" (UID: "df0cdabc-a082-43c5-b7a3-3a831c89082b"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:26:35.226868 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.226812 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/df0cdabc-a082-43c5-b7a3-3a831c89082b-kube-api-access-vd4j5" (OuterVolumeSpecName: "kube-api-access-vd4j5") pod "df0cdabc-a082-43c5-b7a3-3a831c89082b" (UID: "df0cdabc-a082-43c5-b7a3-3a831c89082b"). InnerVolumeSpecName "kube-api-access-vd4j5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:26:35.326191 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.326159 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vd4j5\" (UniqueName: \"kubernetes.io/projected/df0cdabc-a082-43c5-b7a3-3a831c89082b-kube-api-access-vd4j5\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:26:35.326191 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.326187 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/df0cdabc-a082-43c5-b7a3-3a831c89082b-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:26:35.326191 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.326198 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-uds\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:26:35.326444 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.326206 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:26:35.326444 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.326216 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-tmp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:26:35.326444 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.326225 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/df0cdabc-a082-43c5-b7a3-3a831c89082b-tokenizer-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:26:35.374522 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.374489 2569 generic.go:358] "Generic (PLEG): container finished" podID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerID="0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb" exitCode=0 Apr 16 14:26:35.374662 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.374563 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" event={"ID":"df0cdabc-a082-43c5-b7a3-3a831c89082b","Type":"ContainerDied","Data":"0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb"} Apr 16 14:26:35.374662 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.374580 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" Apr 16 14:26:35.374662 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.374596 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn" event={"ID":"df0cdabc-a082-43c5-b7a3-3a831c89082b","Type":"ContainerDied","Data":"4b47df22e279f007563d8d9d27d4f9b31186e646c48c40953ce09a2694a6afc4"} Apr 16 14:26:35.374662 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.374616 2569 scope.go:117] "RemoveContainer" containerID="0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb" Apr 16 14:26:35.382972 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.382953 2569 scope.go:117] "RemoveContainer" containerID="85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103" Apr 16 14:26:35.389702 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.389686 2569 scope.go:117] "RemoveContainer" containerID="cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7" Apr 16 14:26:35.395553 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.395531 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn"] Apr 16 14:26:35.396910 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.396891 2569 scope.go:117] "RemoveContainer" containerID="0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb" Apr 16 14:26:35.397154 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:26:35.397136 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb\": container with ID starting with 0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb not found: ID does not exist" containerID="0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb" Apr 16 14:26:35.397223 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.397197 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb"} err="failed to get container status \"0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb\": rpc error: code = NotFound desc = could not find container \"0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb\": container with ID starting with 0e23e8ff796be6f7bcef35a192951608e14ac1b88b74631574297701c4951cbb not found: ID does not exist" Apr 16 14:26:35.397345 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.397223 2569 scope.go:117] "RemoveContainer" containerID="85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103" Apr 16 14:26:35.397609 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:26:35.397580 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103\": container with ID starting with 85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103 not found: ID does not exist" containerID="85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103" Apr 16 14:26:35.397758 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.397619 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103"} err="failed to get 
container status \"85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103\": rpc error: code = NotFound desc = could not find container \"85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103\": container with ID starting with 85338a0527c493fa7c4ffb44ab9b9cf59938ef721209d7c7178f738513a78103 not found: ID does not exist" Apr 16 14:26:35.397758 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.397642 2569 scope.go:117] "RemoveContainer" containerID="cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7" Apr 16 14:26:35.397946 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:26:35.397925 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7\": container with ID starting with cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7 not found: ID does not exist" containerID="cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7" Apr 16 14:26:35.398003 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.397954 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7"} err="failed to get container status \"cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7\": rpc error: code = NotFound desc = could not find container \"cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7\": container with ID starting with cee4c31d3898ff349cb3f2da90cc5733937ceec330b006b9b04f60827870fdb7 not found: ID does not exist" Apr 16 14:26:35.399953 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:35.399934 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-schenr9tn"] Apr 16 14:26:36.348102 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:36.348070 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" path="/var/lib/kubelet/pods/df0cdabc-a082-43c5-b7a3-3a831c89082b/volumes" Apr 16 14:26:54.223384 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.223350 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc"] Apr 16 14:26:54.223758 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.223674 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="tokenizer" Apr 16 14:26:54.223758 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.223686 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="tokenizer" Apr 16 14:26:54.223758 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.223698 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="storage-initializer" Apr 16 14:26:54.223758 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.223704 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="storage-initializer" Apr 16 14:26:54.223758 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.223713 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="main" Apr 16 14:26:54.223758 ip-10-0-138-227 kubenswrapper[2569]: I0416 
14:26:54.223721 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="main" Apr 16 14:26:54.223977 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.223786 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="tokenizer" Apr 16 14:26:54.223977 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.223798 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="df0cdabc-a082-43c5-b7a3-3a831c89082b" containerName="main" Apr 16 14:26:54.226732 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.226713 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.229713 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.229692 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-9x25x\"" Apr 16 14:26:54.229850 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.229818 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 16 14:26:54.229917 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.229881 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 16 14:26:54.230918 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.230900 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 16 14:26:54.231000 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.230944 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-xs8cb\"" Apr 16 14:26:54.238417 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.238400 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc"] Apr 16 14:26:54.379363 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.379329 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.379528 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.379381 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.379528 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.379453 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/143e316d-4237-492c-845d-7ec55ca0b85f-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: 
\"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.379528 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.379516 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6tbsv\" (UniqueName: \"kubernetes.io/projected/143e316d-4237-492c-845d-7ec55ca0b85f-kube-api-access-6tbsv\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.379657 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.379582 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.379657 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.379640 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481077 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.480989 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481077 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481037 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481340 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/143e316d-4237-492c-845d-7ec55ca0b85f-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481340 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6tbsv\" (UniqueName: \"kubernetes.io/projected/143e316d-4237-492c-845d-7ec55ca0b85f-kube-api-access-6tbsv\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481340 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481261 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481492 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481346 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481492 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481391 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481492 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481448 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481653 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481611 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.481709 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.481673 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.483599 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.483578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/143e316d-4237-492c-845d-7ec55ca0b85f-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.491522 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.491499 2569 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6tbsv\" (UniqueName: \"kubernetes.io/projected/143e316d-4237-492c-845d-7ec55ca0b85f-kube-api-access-6tbsv\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.535772 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.535744 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:54.684203 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.684161 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc"] Apr 16 14:26:54.687019 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:26:54.686993 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod143e316d_4237_492c_845d_7ec55ca0b85f.slice/crio-c25df8a304b89aee4f30ccae06d9a880fb77ef416e9a2fab71c60eae4f13b4ef WatchSource:0}: Error finding container c25df8a304b89aee4f30ccae06d9a880fb77ef416e9a2fab71c60eae4f13b4ef: Status 404 returned error can't find the container with id c25df8a304b89aee4f30ccae06d9a880fb77ef416e9a2fab71c60eae4f13b4ef Apr 16 14:26:54.688921 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:54.688902 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:26:55.435983 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:55.435951 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" event={"ID":"143e316d-4237-492c-845d-7ec55ca0b85f","Type":"ContainerStarted","Data":"b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46"} Apr 16 14:26:55.436315 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:55.435990 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" event={"ID":"143e316d-4237-492c-845d-7ec55ca0b85f","Type":"ContainerStarted","Data":"c25df8a304b89aee4f30ccae06d9a880fb77ef416e9a2fab71c60eae4f13b4ef"} Apr 16 14:26:56.440587 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:56.440548 2569 generic.go:358] "Generic (PLEG): container finished" podID="143e316d-4237-492c-845d-7ec55ca0b85f" containerID="b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46" exitCode=0 Apr 16 14:26:56.440955 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:56.440584 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" event={"ID":"143e316d-4237-492c-845d-7ec55ca0b85f","Type":"ContainerDied","Data":"b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46"} Apr 16 14:26:56.440955 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:56.440618 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" event={"ID":"143e316d-4237-492c-845d-7ec55ca0b85f","Type":"ContainerStarted","Data":"707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753"} Apr 16 14:26:56.440955 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:56.440627 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" event={"ID":"143e316d-4237-492c-845d-7ec55ca0b85f","Type":"ContainerStarted","Data":"6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b"} Apr 16 14:26:56.440955 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:56.440723 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:26:56.464437 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:26:56.464395 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" podStartSLOduration=2.464382564 podStartE2EDuration="2.464382564s" podCreationTimestamp="2026-04-16 14:26:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:26:56.462765038 +0000 UTC m=+1656.656580990" watchObservedRunningTime="2026-04-16 14:26:56.464382564 +0000 UTC m=+1656.658198499" Apr 16 14:27:04.536950 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:04.536910 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:27:04.536950 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:04.536952 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:27:04.539632 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:04.539606 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:27:05.471585 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:05.471550 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:27:07.736572 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.736541 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:27:07.740128 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.740099 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.743513 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.743487 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 16 14:27:07.744833 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.744818 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-xjkzv\"" Apr 16 14:27:07.758005 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.757981 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:27:07.800006 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.799977 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.800006 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.800008 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.800207 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.800031 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.800207 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.800127 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xr9nc\" (UniqueName: \"kubernetes.io/projected/b38e263e-c344-4983-8433-d6eecae467a2-kube-api-access-xr9nc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.800207 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.800181 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.800337 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.800225 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b38e263e-c344-4983-8433-d6eecae467a2-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901205 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901168 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901387 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901226 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xr9nc\" (UniqueName: \"kubernetes.io/projected/b38e263e-c344-4983-8433-d6eecae467a2-kube-api-access-xr9nc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901387 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901289 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901387 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901336 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b38e263e-c344-4983-8433-d6eecae467a2-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901387 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901364 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901387 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901386 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901700 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901672 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901851 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901832 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.901905 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.901871 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.904051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.904026 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.904241 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.904223 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b38e263e-c344-4983-8433-d6eecae467a2-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:07.909923 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:07.909903 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xr9nc\" (UniqueName: \"kubernetes.io/projected/b38e263e-c344-4983-8433-d6eecae467a2-kube-api-access-xr9nc\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:08.050217 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:08.050185 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:27:08.178113 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:08.178088 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:27:08.180451 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:27:08.180425 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb38e263e_c344_4983_8433_d6eecae467a2.slice/crio-be361e0034345bc4e1c74fe72a806b6cade8358bcc2b9be09ab954409d29c416 WatchSource:0}: Error finding container be361e0034345bc4e1c74fe72a806b6cade8358bcc2b9be09ab954409d29c416: Status 404 returned error can't find the container with id be361e0034345bc4e1c74fe72a806b6cade8358bcc2b9be09ab954409d29c416 Apr 16 14:27:08.482594 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:08.482500 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b38e263e-c344-4983-8433-d6eecae467a2","Type":"ContainerStarted","Data":"acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512"} Apr 16 14:27:08.482594 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:08.482545 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b38e263e-c344-4983-8433-d6eecae467a2","Type":"ContainerStarted","Data":"be361e0034345bc4e1c74fe72a806b6cade8358bcc2b9be09ab954409d29c416"} Apr 16 14:27:12.498154 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:12.498121 2569 generic.go:358] "Generic (PLEG): container finished" podID="b38e263e-c344-4983-8433-d6eecae467a2" containerID="acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512" exitCode=0 Apr 16 14:27:12.498561 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:12.498167 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b38e263e-c344-4983-8433-d6eecae467a2","Type":"ContainerDied","Data":"acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512"} Apr 16 14:27:26.476296 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:27:26.476251 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:28:00.668801 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:28:00.668762 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b38e263e-c344-4983-8433-d6eecae467a2","Type":"ContainerStarted","Data":"11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2"} Apr 16 14:28:00.688990 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:28:00.688940 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=6.094194098 podStartE2EDuration="53.688926261s" podCreationTimestamp="2026-04-16 14:27:07 +0000 UTC" firstStartedPulling="2026-04-16 14:27:12.499362441 +0000 UTC m=+1672.693178354" lastFinishedPulling="2026-04-16 14:28:00.0940946 +0000 UTC m=+1720.287910517" observedRunningTime="2026-04-16 14:28:00.687173667 +0000 UTC m=+1720.880989599" watchObservedRunningTime="2026-04-16 14:28:00.688926261 +0000 UTC m=+1720.882742195" 
Apr 16 14:30:11.020096 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.020053 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:30:11.020712 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.020460 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="b38e263e-c344-4983-8433-d6eecae467a2" containerName="main" containerID="cri-o://11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2" gracePeriod=30 Apr 16 14:30:11.884247 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.884223 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:30:11.926772 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.926742 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-dshm\") pod \"b38e263e-c344-4983-8433-d6eecae467a2\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " Apr 16 14:30:11.926973 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.926801 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xr9nc\" (UniqueName: \"kubernetes.io/projected/b38e263e-c344-4983-8433-d6eecae467a2-kube-api-access-xr9nc\") pod \"b38e263e-c344-4983-8433-d6eecae467a2\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " Apr 16 14:30:11.926973 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.926849 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-kserve-provision-location\") pod \"b38e263e-c344-4983-8433-d6eecae467a2\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " Apr 16 14:30:11.926973 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.926882 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-home\") pod \"b38e263e-c344-4983-8433-d6eecae467a2\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " Apr 16 14:30:11.926973 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.926914 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-model-cache\") pod \"b38e263e-c344-4983-8433-d6eecae467a2\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " Apr 16 14:30:11.926973 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.926939 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b38e263e-c344-4983-8433-d6eecae467a2-tls-certs\") pod \"b38e263e-c344-4983-8433-d6eecae467a2\" (UID: \"b38e263e-c344-4983-8433-d6eecae467a2\") " Apr 16 14:30:11.928337 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.928305 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-model-cache" (OuterVolumeSpecName: "model-cache") pod "b38e263e-c344-4983-8433-d6eecae467a2" (UID: "b38e263e-c344-4983-8433-d6eecae467a2"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:11.928509 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.928482 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-home" (OuterVolumeSpecName: "home") pod "b38e263e-c344-4983-8433-d6eecae467a2" (UID: "b38e263e-c344-4983-8433-d6eecae467a2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:11.929824 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.929660 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b38e263e-c344-4983-8433-d6eecae467a2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b38e263e-c344-4983-8433-d6eecae467a2" (UID: "b38e263e-c344-4983-8433-d6eecae467a2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:30:11.929824 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.929723 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-dshm" (OuterVolumeSpecName: "dshm") pod "b38e263e-c344-4983-8433-d6eecae467a2" (UID: "b38e263e-c344-4983-8433-d6eecae467a2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:11.929824 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.929783 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b38e263e-c344-4983-8433-d6eecae467a2-kube-api-access-xr9nc" (OuterVolumeSpecName: "kube-api-access-xr9nc") pod "b38e263e-c344-4983-8433-d6eecae467a2" (UID: "b38e263e-c344-4983-8433-d6eecae467a2"). InnerVolumeSpecName "kube-api-access-xr9nc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:30:11.988555 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:11.988494 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b38e263e-c344-4983-8433-d6eecae467a2" (UID: "b38e263e-c344-4983-8433-d6eecae467a2"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:12.028461 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.028422 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:12.028461 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.028456 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-home\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:12.028461 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.028470 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-model-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:12.028951 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.028484 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b38e263e-c344-4983-8433-d6eecae467a2-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:12.028951 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.028495 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b38e263e-c344-4983-8433-d6eecae467a2-dshm\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:12.028951 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.028507 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xr9nc\" (UniqueName: \"kubernetes.io/projected/b38e263e-c344-4983-8433-d6eecae467a2-kube-api-access-xr9nc\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:12.104687 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.104650 2569 generic.go:358] "Generic (PLEG): container finished" podID="b38e263e-c344-4983-8433-d6eecae467a2" containerID="11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2" exitCode=0 Apr 16 14:30:12.104857 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.104723 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 16 14:30:12.104857 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.104723 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b38e263e-c344-4983-8433-d6eecae467a2","Type":"ContainerDied","Data":"11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2"} Apr 16 14:30:12.104857 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.104762 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"b38e263e-c344-4983-8433-d6eecae467a2","Type":"ContainerDied","Data":"be361e0034345bc4e1c74fe72a806b6cade8358bcc2b9be09ab954409d29c416"} Apr 16 14:30:12.104857 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.104777 2569 scope.go:117] "RemoveContainer" containerID="11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2" Apr 16 14:30:12.125460 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.125203 2569 scope.go:117] "RemoveContainer" containerID="acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512" Apr 16 14:30:12.138150 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.138122 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:30:12.139158 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.139106 2569 scope.go:117] "RemoveContainer" containerID="11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2" Apr 16 14:30:12.139474 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:30:12.139446 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2\": container with ID starting with 11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2 not found: ID does not exist" containerID="11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2" Apr 16 14:30:12.139569 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.139485 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2"} err="failed to get container status \"11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2\": rpc error: code = NotFound desc = could not find container \"11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2\": container with ID starting with 11990bf6248f5e83edc602d537bf3e06b732b4221f7c9be1b9d34b47a25fe8a2 not found: ID does not exist" Apr 16 14:30:12.139569 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.139511 2569 scope.go:117] "RemoveContainer" containerID="acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512" Apr 16 14:30:12.139752 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:30:12.139738 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512\": container with ID starting with acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512 not found: ID does not exist" containerID="acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512" Apr 16 14:30:12.139811 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.139756 2569 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512"} err="failed to get container status \"acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512\": rpc error: code = NotFound desc = could not find container \"acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512\": container with ID starting with acaf2f5580fef4fd364ff3b39c199bffc95426d64d910d704107fc7d60b01512 not found: ID does not exist" Apr 16 14:30:12.145134 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.145111 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 16 14:30:12.349490 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:12.349459 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b38e263e-c344-4983-8433-d6eecae467a2" path="/var/lib/kubelet/pods/b38e263e-c344-4983-8433-d6eecae467a2/volumes" Apr 16 14:30:29.187082 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.187051 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m"] Apr 16 14:30:29.187598 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.187508 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b38e263e-c344-4983-8433-d6eecae467a2" containerName="storage-initializer" Apr 16 14:30:29.187598 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.187526 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38e263e-c344-4983-8433-d6eecae467a2" containerName="storage-initializer" Apr 16 14:30:29.187598 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.187545 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b38e263e-c344-4983-8433-d6eecae467a2" containerName="main" Apr 16 14:30:29.187598 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.187554 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="b38e263e-c344-4983-8433-d6eecae467a2" containerName="main" Apr 16 14:30:29.187806 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.187638 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="b38e263e-c344-4983-8433-d6eecae467a2" containerName="main" Apr 16 14:30:29.190611 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.190590 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.196824 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.196794 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 16 14:30:29.198005 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.197986 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-tblkt\"" Apr 16 14:30:29.216422 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.216397 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m"] Apr 16 14:30:29.276526 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.276486 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.276716 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.276545 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pdvnp\" (UniqueName: \"kubernetes.io/projected/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kube-api-access-pdvnp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.276716 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.276608 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.276716 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.276652 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.276716 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.276687 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.276885 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.276741 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for 
volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378133 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378098 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pdvnp\" (UniqueName: \"kubernetes.io/projected/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kube-api-access-pdvnp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378346 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378147 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378346 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378166 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378346 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378199 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378346 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378221 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378346 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378292 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378595 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378578 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-uds\") pod 
\"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378632 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378600 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378672 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378630 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.378706 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.378664 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.380839 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.380816 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.388003 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.387973 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pdvnp\" (UniqueName: \"kubernetes.io/projected/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kube-api-access-pdvnp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.500035 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.499961 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:29.630510 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:29.630488 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m"] Apr 16 14:30:29.632575 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:30:29.632546 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod30b7247a_12d1_4724_90f2_08cb0e52bcbd.slice/crio-b6c707ecb3033a093f0bf83b7d452ca31cad6e060a72425c64a686e225160e31 WatchSource:0}: Error finding container b6c707ecb3033a093f0bf83b7d452ca31cad6e060a72425c64a686e225160e31: Status 404 returned error can't find the container with id b6c707ecb3033a093f0bf83b7d452ca31cad6e060a72425c64a686e225160e31 Apr 16 14:30:30.166188 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:30.166142 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" event={"ID":"30b7247a-12d1-4724-90f2-08cb0e52bcbd","Type":"ContainerStarted","Data":"bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994"} Apr 16 14:30:30.166408 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:30.166202 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" event={"ID":"30b7247a-12d1-4724-90f2-08cb0e52bcbd","Type":"ContainerStarted","Data":"b6c707ecb3033a093f0bf83b7d452ca31cad6e060a72425c64a686e225160e31"} Apr 16 14:30:31.171827 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:31.171785 2569 generic.go:358] "Generic (PLEG): container finished" podID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerID="bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994" exitCode=0 Apr 16 14:30:31.172334 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:31.171861 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" event={"ID":"30b7247a-12d1-4724-90f2-08cb0e52bcbd","Type":"ContainerDied","Data":"bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994"} Apr 16 14:30:31.202095 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:31.202069 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc"] Apr 16 14:30:31.202402 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:31.202378 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="main" containerID="cri-o://6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b" gracePeriod=30 Apr 16 14:30:31.202484 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:31.202427 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="tokenizer" containerID="cri-o://707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753" gracePeriod=30 Apr 16 14:30:32.177431 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.177397 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" event={"ID":"30b7247a-12d1-4724-90f2-08cb0e52bcbd","Type":"ContainerStarted","Data":"cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f"} Apr 16 14:30:32.177431 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.177435 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" event={"ID":"30b7247a-12d1-4724-90f2-08cb0e52bcbd","Type":"ContainerStarted","Data":"ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48"} Apr 16 14:30:32.177849 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.177529 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:32.179801 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.179778 2569 generic.go:358] "Generic (PLEG): container finished" podID="143e316d-4237-492c-845d-7ec55ca0b85f" containerID="6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b" exitCode=0 Apr 16 14:30:32.179929 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.179814 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" event={"ID":"143e316d-4237-492c-845d-7ec55ca0b85f","Type":"ContainerDied","Data":"6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b"} Apr 16 14:30:32.200470 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.200414 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" podStartSLOduration=3.200394275 podStartE2EDuration="3.200394275s" podCreationTimestamp="2026-04-16 14:30:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:32.200066033 +0000 UTC m=+1872.393881968" watchObservedRunningTime="2026-04-16 14:30:32.200394275 +0000 UTC m=+1872.394210212" Apr 16 14:30:32.448041 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.448020 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:30:32.604358 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604321 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-cache\") pod \"143e316d-4237-492c-845d-7ec55ca0b85f\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " Apr 16 14:30:32.604539 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604390 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6tbsv\" (UniqueName: \"kubernetes.io/projected/143e316d-4237-492c-845d-7ec55ca0b85f-kube-api-access-6tbsv\") pod \"143e316d-4237-492c-845d-7ec55ca0b85f\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " Apr 16 14:30:32.604539 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604414 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-tmp\") pod \"143e316d-4237-492c-845d-7ec55ca0b85f\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " Apr 16 14:30:32.604539 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604479 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-kserve-provision-location\") pod \"143e316d-4237-492c-845d-7ec55ca0b85f\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " Apr 16 14:30:32.604539 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604509 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-uds\") pod \"143e316d-4237-492c-845d-7ec55ca0b85f\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " Apr 16 14:30:32.604539 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604538 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/143e316d-4237-492c-845d-7ec55ca0b85f-tls-certs\") pod \"143e316d-4237-492c-845d-7ec55ca0b85f\" (UID: \"143e316d-4237-492c-845d-7ec55ca0b85f\") " Apr 16 14:30:32.604793 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604604 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "143e316d-4237-492c-845d-7ec55ca0b85f" (UID: "143e316d-4237-492c-845d-7ec55ca0b85f"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:32.604850 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604817 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:32.604902 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.604838 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "143e316d-4237-492c-845d-7ec55ca0b85f" (UID: "143e316d-4237-492c-845d-7ec55ca0b85f"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:32.609381 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.609353 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "143e316d-4237-492c-845d-7ec55ca0b85f" (UID: "143e316d-4237-492c-845d-7ec55ca0b85f"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:32.612301 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.609573 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "143e316d-4237-492c-845d-7ec55ca0b85f" (UID: "143e316d-4237-492c-845d-7ec55ca0b85f"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:30:32.616409 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.616333 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/143e316d-4237-492c-845d-7ec55ca0b85f-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "143e316d-4237-492c-845d-7ec55ca0b85f" (UID: "143e316d-4237-492c-845d-7ec55ca0b85f"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:30:32.622279 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.621028 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/143e316d-4237-492c-845d-7ec55ca0b85f-kube-api-access-6tbsv" (OuterVolumeSpecName: "kube-api-access-6tbsv") pod "143e316d-4237-492c-845d-7ec55ca0b85f" (UID: "143e316d-4237-492c-845d-7ec55ca0b85f"). InnerVolumeSpecName "kube-api-access-6tbsv". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:30:32.705634 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.705556 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:32.705634 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.705591 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-uds\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:32.705634 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.705607 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/143e316d-4237-492c-845d-7ec55ca0b85f-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:32.705634 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.705620 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6tbsv\" (UniqueName: \"kubernetes.io/projected/143e316d-4237-492c-845d-7ec55ca0b85f-kube-api-access-6tbsv\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:32.705634 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:32.705635 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/143e316d-4237-492c-845d-7ec55ca0b85f-tokenizer-tmp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:30:33.185208 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.185166 2569 generic.go:358] "Generic (PLEG): container finished" podID="143e316d-4237-492c-845d-7ec55ca0b85f" containerID="707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753" exitCode=0 Apr 16 14:30:33.185678 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.185257 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" event={"ID":"143e316d-4237-492c-845d-7ec55ca0b85f","Type":"ContainerDied","Data":"707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753"} Apr 16 14:30:33.185678 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.185292 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" Apr 16 14:30:33.185678 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.185317 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc" event={"ID":"143e316d-4237-492c-845d-7ec55ca0b85f","Type":"ContainerDied","Data":"c25df8a304b89aee4f30ccae06d9a880fb77ef416e9a2fab71c60eae4f13b4ef"} Apr 16 14:30:33.185678 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.185338 2569 scope.go:117] "RemoveContainer" containerID="707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753" Apr 16 14:30:33.194049 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.194029 2569 scope.go:117] "RemoveContainer" containerID="6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b" Apr 16 14:30:33.202323 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.202307 2569 scope.go:117] "RemoveContainer" containerID="b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46" Apr 16 14:30:33.208584 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.208560 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc"] Apr 16 14:30:33.210289 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.210252 2569 scope.go:117] "RemoveContainer" containerID="707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753" Apr 16 14:30:33.211145 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:30:33.211114 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753\": container with ID starting with 707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753 not found: ID does not exist" containerID="707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753" Apr 16 14:30:33.211243 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.211156 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753"} err="failed to get container status \"707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753\": rpc error: code = NotFound desc = could not find container \"707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753\": container with ID starting with 707d6fd354b33df9316362573fc357773ee787bbc6c2bd5abac2bd79ec573753 not found: ID does not exist" Apr 16 14:30:33.211243 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.211182 2569 scope.go:117] "RemoveContainer" containerID="6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b" Apr 16 14:30:33.211528 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:30:33.211494 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b\": container with ID starting with 6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b not found: ID does not exist" containerID="6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b" Apr 16 14:30:33.211598 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.211538 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b"} err="failed to get 
container status \"6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b\": rpc error: code = NotFound desc = could not find container \"6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b\": container with ID starting with 6097220b15559a7dbc4b928f09d6950bcbec2c6891eee3ecaf87376d4b7d7e9b not found: ID does not exist" Apr 16 14:30:33.211598 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.211560 2569 scope.go:117] "RemoveContainer" containerID="b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46" Apr 16 14:30:33.211858 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:30:33.211834 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46\": container with ID starting with b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46 not found: ID does not exist" containerID="b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46" Apr 16 14:30:33.211962 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.211865 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46"} err="failed to get container status \"b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46\": rpc error: code = NotFound desc = could not find container \"b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46\": container with ID starting with b3eec16d563fe90e0a742d5cab80dfc250184ca46a2aa5fbc69721f7b31b2d46 not found: ID does not exist" Apr 16 14:30:33.212371 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:33.212353 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-5ddc69fnbc"] Apr 16 14:30:34.349296 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:34.349248 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" path="/var/lib/kubelet/pods/143e316d-4237-492c-845d-7ec55ca0b85f/volumes" Apr 16 14:30:39.500316 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:39.500205 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:39.500316 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:39.500250 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:39.502929 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:39.502906 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:40.210220 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:40.210188 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:30:46.978909 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.978871 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj"] Apr 16 14:30:46.979398 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.979297 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" 
containerName="tokenizer" Apr 16 14:30:46.979398 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.979315 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="tokenizer" Apr 16 14:30:46.979398 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.979343 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="main" Apr 16 14:30:46.979398 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.979351 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="main" Apr 16 14:30:46.979398 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.979360 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="storage-initializer" Apr 16 14:30:46.979398 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.979369 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="storage-initializer" Apr 16 14:30:46.979670 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.979431 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="tokenizer" Apr 16 14:30:46.979670 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.979442 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="143e316d-4237-492c-845d-7ec55ca0b85f" containerName="main" Apr 16 14:30:46.984532 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.984512 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:46.987515 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.987495 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-kkwrg\"" Apr 16 14:30:46.987662 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.987645 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 16 14:30:46.994618 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.994594 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj"] Apr 16 14:30:46.997845 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:46.997823 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7"] Apr 16 14:30:47.001053 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.001034 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.022284 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.022246 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7"] Apr 16 14:30:47.117968 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.117937 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-home\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.118137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.117982 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01da8c6e-9c53-46bc-a992-127910e8de45-tls-certs\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.118137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118040 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.118137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118076 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.118137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118106 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.118332 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118147 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thkjb\" (UniqueName: \"kubernetes.io/projected/01da8c6e-9c53-46bc-a992-127910e8de45-kube-api-access-thkjb\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.118332 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118186 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-model-cache\") pod 
\"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.118332 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118203 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.118332 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118223 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xv4mw\" (UniqueName: \"kubernetes.io/projected/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kube-api-access-xv4mw\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.118332 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118239 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.118332 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118254 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-dshm\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.118332 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.118303 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-home\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.219571 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.219540 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thkjb\" (UniqueName: \"kubernetes.io/projected/01da8c6e-9c53-46bc-a992-127910e8de45-kube-api-access-thkjb\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.219773 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.219585 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-model-cache\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.219773 ip-10-0-138-227 
kubenswrapper[2569]: I0416 14:30:47.219607 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.219773 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.219628 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xv4mw\" (UniqueName: \"kubernetes.io/projected/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kube-api-access-xv4mw\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.219773 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.219656 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.219992 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.219825 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-dshm\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.219992 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.219892 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-home\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.220126 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220099 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-model-cache\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.220192 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220171 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-home\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.220263 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220197 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-home\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.220263 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220254 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01da8c6e-9c53-46bc-a992-127910e8de45-tls-certs\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.220263 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220178 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.220429 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220310 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.220429 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220341 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.220429 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220383 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.220578 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220442 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-home\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.220660 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220637 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.220721 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.220698 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.222338 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.222311 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-dshm\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.222654 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.222636 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.222750 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.222732 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.222807 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.222738 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01da8c6e-9c53-46bc-a992-127910e8de45-tls-certs\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.227640 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.227614 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xv4mw\" (UniqueName: \"kubernetes.io/projected/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kube-api-access-xv4mw\") pod \"router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.228447 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.228426 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thkjb\" (UniqueName: \"kubernetes.io/projected/01da8c6e-9c53-46bc-a992-127910e8de45-kube-api-access-thkjb\") pod \"router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.295114 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.294943 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:47.311062 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.311035 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:47.448059 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.448031 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj"] Apr 16 14:30:47.451210 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:30:47.451183 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod01da8c6e_9c53_46bc_a992_127910e8de45.slice/crio-2f94c0c0ba1139f9d24280999c829172546c9adb84c49bd1ddf3c8b4c4f0ecd8 WatchSource:0}: Error finding container 2f94c0c0ba1139f9d24280999c829172546c9adb84c49bd1ddf3c8b4c4f0ecd8: Status 404 returned error can't find the container with id 2f94c0c0ba1139f9d24280999c829172546c9adb84c49bd1ddf3c8b4c4f0ecd8 Apr 16 14:30:47.467319 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:47.467252 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7"] Apr 16 14:30:47.469616 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:30:47.469592 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf37ed6b5_8349_4076_8cc9_e35c3644b2a3.slice/crio-1ed53079ecfff1b112e82987f7c54136b795c08db4a1775343690785d9fe8b43 WatchSource:0}: Error finding container 1ed53079ecfff1b112e82987f7c54136b795c08db4a1775343690785d9fe8b43: Status 404 returned error can't find the container with id 1ed53079ecfff1b112e82987f7c54136b795c08db4a1775343690785d9fe8b43 Apr 16 14:30:48.236792 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:48.236748 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" event={"ID":"f37ed6b5-8349-4076-8cc9-e35c3644b2a3","Type":"ContainerStarted","Data":"578915168caf68011d18104465a1bfc63a011c2e37df6a67ded6f5675d86c532"} Apr 16 14:30:48.237225 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:48.236797 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" event={"ID":"f37ed6b5-8349-4076-8cc9-e35c3644b2a3","Type":"ContainerStarted","Data":"1ed53079ecfff1b112e82987f7c54136b795c08db4a1775343690785d9fe8b43"} Apr 16 14:30:48.237877 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:48.237848 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" event={"ID":"01da8c6e-9c53-46bc-a992-127910e8de45","Type":"ContainerStarted","Data":"2f94c0c0ba1139f9d24280999c829172546c9adb84c49bd1ddf3c8b4c4f0ecd8"} Apr 16 14:30:49.245566 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:49.245472 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" event={"ID":"01da8c6e-9c53-46bc-a992-127910e8de45","Type":"ContainerStarted","Data":"88ae3ade40afbf6f0d4873b04422f0413ac08f01d2edef6a9fb9967115ff1e56"} Apr 16 14:30:49.247883 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:49.245622 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:50.251073 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:50.251033 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" event={"ID":"01da8c6e-9c53-46bc-a992-127910e8de45","Type":"ContainerStarted","Data":"96873a5215e7aed9d17d1481a632110928688d88ddfad156987158c1b61e058c"} Apr 16 14:30:52.258538 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:52.258502 2569 generic.go:358] "Generic (PLEG): container finished" podID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerID="578915168caf68011d18104465a1bfc63a011c2e37df6a67ded6f5675d86c532" exitCode=0 Apr 16 14:30:52.258955 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:52.258556 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" event={"ID":"f37ed6b5-8349-4076-8cc9-e35c3644b2a3","Type":"ContainerDied","Data":"578915168caf68011d18104465a1bfc63a011c2e37df6a67ded6f5675d86c532"} Apr 16 14:30:53.269947 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:53.269916 2569 generic.go:358] "Generic (PLEG): container finished" podID="01da8c6e-9c53-46bc-a992-127910e8de45" containerID="96873a5215e7aed9d17d1481a632110928688d88ddfad156987158c1b61e058c" exitCode=0 Apr 16 14:30:53.270404 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:53.269988 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" event={"ID":"01da8c6e-9c53-46bc-a992-127910e8de45","Type":"ContainerDied","Data":"96873a5215e7aed9d17d1481a632110928688d88ddfad156987158c1b61e058c"} Apr 16 14:30:53.271734 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:53.271714 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" event={"ID":"f37ed6b5-8349-4076-8cc9-e35c3644b2a3","Type":"ContainerStarted","Data":"b75ece7675e3e960e50352e5d335258103bcac884ae4a463ed69871c118be7b4"} Apr 16 14:30:53.318232 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:53.318188 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podStartSLOduration=7.318172157 podStartE2EDuration="7.318172157s" podCreationTimestamp="2026-04-16 14:30:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:30:53.317114306 +0000 UTC m=+1893.510930253" watchObservedRunningTime="2026-04-16 14:30:53.318172157 +0000 UTC m=+1893.511988093" Apr 16 14:30:54.277424 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:54.277387 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" event={"ID":"01da8c6e-9c53-46bc-a992-127910e8de45","Type":"ContainerStarted","Data":"32354bc47eda3deb1640aee64c1782e6cc7037ebc6dfce41b77ab3672c371eb1"} Apr 16 14:30:54.300996 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:54.300945 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podStartSLOduration=7.4263905359999995 podStartE2EDuration="8.30092865s" podCreationTimestamp="2026-04-16 14:30:46 +0000 UTC" firstStartedPulling="2026-04-16 14:30:47.453232319 +0000 UTC m=+1887.647048233" lastFinishedPulling="2026-04-16 14:30:48.32777042 +0000 UTC m=+1888.521586347" observedRunningTime="2026-04-16 14:30:54.299301399 +0000 UTC m=+1894.493117326" watchObservedRunningTime="2026-04-16 14:30:54.30092865 +0000 UTC m=+1894.494744584" Apr 16 
14:30:57.295137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:57.295101 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:57.295137 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:57.295143 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:30:57.296751 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:57.296722 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:30:57.311563 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:57.311537 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:57.311658 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:57.311581 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:30:57.312897 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:30:57.312866 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:31:01.215117 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:01.215088 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:31:07.296402 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:07.296348 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:31:07.312173 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:07.312138 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:31:07.315012 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:07.314981 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:31:10.933312 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:10.933258 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m"] Apr 16 14:31:10.933837 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:10.933610 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" 
podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="main" containerID="cri-o://ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48" gracePeriod=30 Apr 16 14:31:10.933837 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:10.933660 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="tokenizer" containerID="cri-o://cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f" gracePeriod=30 Apr 16 14:31:11.214079 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:31:11.214000 2569 logging.go:55] [core] [Channel #494 SubChannel #495]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.37:9003", ServerName: "10.132.0.37:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.37:9003: connect: connection refused" Apr 16 14:31:11.345470 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:11.345433 2569 generic.go:358] "Generic (PLEG): container finished" podID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerID="ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48" exitCode=0 Apr 16 14:31:11.345618 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:11.345507 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" event={"ID":"30b7247a-12d1-4724-90f2-08cb0e52bcbd","Type":"ContainerDied","Data":"ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48"} Apr 16 14:31:12.214490 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.214443 2569 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.37:9003\" within 1s: context deadline exceeded" Apr 16 14:31:12.296054 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.296025 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:31:12.336945 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.336863 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kserve-provision-location\") pod \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " Apr 16 14:31:12.336945 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.336899 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-cache\") pod \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " Apr 16 14:31:12.336945 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.336924 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pdvnp\" (UniqueName: \"kubernetes.io/projected/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kube-api-access-pdvnp\") pod \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " Apr 16 14:31:12.337215 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.336995 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tls-certs\") pod \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " Apr 16 14:31:12.337215 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.337027 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-tmp\") pod \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " Apr 16 14:31:12.337215 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.337077 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-uds\") pod \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\" (UID: \"30b7247a-12d1-4724-90f2-08cb0e52bcbd\") " Apr 16 14:31:12.337532 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.337497 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "30b7247a-12d1-4724-90f2-08cb0e52bcbd" (UID: "30b7247a-12d1-4724-90f2-08cb0e52bcbd"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:31:12.337629 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.337590 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "30b7247a-12d1-4724-90f2-08cb0e52bcbd" (UID: "30b7247a-12d1-4724-90f2-08cb0e52bcbd"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:31:12.337809 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.337775 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "30b7247a-12d1-4724-90f2-08cb0e52bcbd" (UID: "30b7247a-12d1-4724-90f2-08cb0e52bcbd"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:31:12.337930 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.337846 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "30b7247a-12d1-4724-90f2-08cb0e52bcbd" (UID: "30b7247a-12d1-4724-90f2-08cb0e52bcbd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:31:12.340046 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.339869 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "30b7247a-12d1-4724-90f2-08cb0e52bcbd" (UID: "30b7247a-12d1-4724-90f2-08cb0e52bcbd"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:31:12.340363 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.340334 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kube-api-access-pdvnp" (OuterVolumeSpecName: "kube-api-access-pdvnp") pod "30b7247a-12d1-4724-90f2-08cb0e52bcbd" (UID: "30b7247a-12d1-4724-90f2-08cb0e52bcbd"). InnerVolumeSpecName "kube-api-access-pdvnp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:31:12.351588 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.351546 2569 generic.go:358] "Generic (PLEG): container finished" podID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerID="cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f" exitCode=0 Apr 16 14:31:12.351705 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.351626 2569 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" Apr 16 14:31:12.351705 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.351672 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" event={"ID":"30b7247a-12d1-4724-90f2-08cb0e52bcbd","Type":"ContainerDied","Data":"cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f"} Apr 16 14:31:12.351705 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.351700 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m" event={"ID":"30b7247a-12d1-4724-90f2-08cb0e52bcbd","Type":"ContainerDied","Data":"b6c707ecb3033a093f0bf83b7d452ca31cad6e060a72425c64a686e225160e31"} Apr 16 14:31:12.351932 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.351722 2569 scope.go:117] "RemoveContainer" containerID="cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f" Apr 16 14:31:12.366616 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.366593 2569 scope.go:117] "RemoveContainer" containerID="ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48" Apr 16 14:31:12.374882 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.374864 2569 scope.go:117] "RemoveContainer" containerID="bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994" Apr 16 14:31:12.383492 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.383368 2569 scope.go:117] "RemoveContainer" containerID="cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f" Apr 16 14:31:12.383693 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:31:12.383666 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f\": container with ID starting with cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f not found: ID does not exist" containerID="cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f" Apr 16 14:31:12.383757 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.383707 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f"} err="failed to get container status \"cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f\": rpc error: code = NotFound desc = could not find container \"cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f\": container with ID starting with cccdc801a0c3f7daf87712cc0c662e61a1f0423d0f87fabf5a3ef3492eb1597f not found: ID does not exist" Apr 16 14:31:12.383757 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.383730 2569 scope.go:117] "RemoveContainer" containerID="ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48" Apr 16 14:31:12.384015 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:31:12.383995 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48\": container with ID starting with ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48 not found: ID does not exist" containerID="ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48" Apr 16 14:31:12.384091 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.384034 2569 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48"} err="failed to get container status \"ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48\": rpc error: code = NotFound desc = could not find container \"ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48\": container with ID starting with ed48ae3a75f53bf318eb6ef4eeac2653fde746e982c01309259a85a47f21ee48 not found: ID does not exist" Apr 16 14:31:12.384091 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.384058 2569 scope.go:117] "RemoveContainer" containerID="bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994" Apr 16 14:31:12.384393 ip-10-0-138-227 kubenswrapper[2569]: E0416 14:31:12.384374 2569 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994\": container with ID starting with bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994 not found: ID does not exist" containerID="bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994" Apr 16 14:31:12.384462 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.384402 2569 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994"} err="failed to get container status \"bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994\": rpc error: code = NotFound desc = could not find container \"bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994\": container with ID starting with bb0b52f38dc3e84ce14080baf8d38cc02f1e6d29959c27d7c4c91139fe049994 not found: ID does not exist" Apr 16 14:31:12.385152 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.385134 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m"] Apr 16 14:31:12.391025 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.391002 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-6476dj6v4m"] Apr 16 14:31:12.438242 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.438213 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:31:12.438242 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.438237 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:31:12.438242 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.438248 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pdvnp\" (UniqueName: \"kubernetes.io/projected/30b7247a-12d1-4724-90f2-08cb0e52bcbd-kube-api-access-pdvnp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:31:12.438528 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.438257 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:31:12.438528 
ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.438281 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-tmp\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:31:12.438528 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:12.438292 2569 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/30b7247a-12d1-4724-90f2-08cb0e52bcbd-tokenizer-uds\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:31:14.349241 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:14.349199 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" path="/var/lib/kubelet/pods/30b7247a-12d1-4724-90f2-08cb0e52bcbd/volumes" Apr 16 14:31:17.295384 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:17.295341 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:31:17.312238 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:17.312189 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:31:27.295581 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:27.295524 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:31:27.312051 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:27.312020 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:31:37.295534 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:37.295472 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:31:37.311973 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:37.311933 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:31:47.296224 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:47.296176 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" 
podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:31:47.312217 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:47.312179 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:31:57.296202 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:57.296156 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:31:57.311781 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:31:57.311737 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:32:07.296092 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:07.296032 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:32:07.311878 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:07.311846 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:32:17.295447 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:17.295332 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:32:17.311572 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:17.311531 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:32:27.295705 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:27.295660 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:32:27.311447 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:27.311411 2569 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:32:37.295817 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:37.295777 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:32:37.312261 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:37.312223 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:32:47.295688 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:47.295636 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:32:47.312668 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:47.312620 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:32:57.295517 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:57.295459 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8001/health\": dial tcp 10.132.0.38:8001: connect: connection refused" Apr 16 14:32:57.311941 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:32:57.311902 2569 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 16 14:33:07.304959 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:07.304931 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:33:07.317730 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:07.317709 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:33:07.321822 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:07.321797 2569 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:33:07.329223 ip-10-0-138-227 kubenswrapper[2569]: I0416 
14:33:07.329204 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:33:18.679723 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:18.679686 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj"] Apr 16 14:33:18.680871 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:18.680801 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" containerID="cri-o://32354bc47eda3deb1640aee64c1782e6cc7037ebc6dfce41b77ab3672c371eb1" gracePeriod=30 Apr 16 14:33:18.682960 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:18.682938 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7"] Apr 16 14:33:18.683180 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:18.683157 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" containerID="cri-o://b75ece7675e3e960e50352e5d335258103bcac884ae4a463ed69871c118be7b4" gracePeriod=30 Apr 16 14:33:35.037409 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:35.037377 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:35.055376 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:35.055346 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:35.069076 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:35.069051 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:35.091637 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:35.091614 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:35.101762 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:35.101742 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:36.151419 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:36.151392 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:36.162996 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:36.162959 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:36.175072 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:36.175052 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:36.199722 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:36.199692 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:36.208531 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:36.208510 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:37.220869 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:37.220827 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:37.229260 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:37.229236 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:37.253240 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:37.253219 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:37.285864 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:37.285843 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:37.299078 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:37.299057 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:38.285106 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:38.285079 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:38.293571 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:38.293549 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:38.305961 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:38.305941 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:38.330223 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:38.330199 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:38.340361 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:38.340331 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:39.342456 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:39.342378 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:39.350635 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:39.350610 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:39.363904 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:39.363877 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:39.384917 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:39.384891 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:39.393451 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:39.393427 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:40.385904 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:40.385880 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:40.399189 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:40.399160 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:40.419619 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:40.419594 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:40.448232 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:40.448208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:40.470793 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:40.470760 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:41.480711 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:41.480679 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:41.490285 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:41.490240 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:41.503177 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:41.503149 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:41.525935 ip-10-0-138-227 kubenswrapper[2569]: 
I0416 14:33:41.525916 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:41.535348 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:41.535326 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:42.496810 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:42.496783 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:42.504437 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:42.504409 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:42.516588 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:42.516562 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:42.538354 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:42.538332 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:42.546635 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:42.546612 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:43.508881 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:43.508851 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:43.518832 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:43.518809 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:43.531221 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:43.531183 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:43.556028 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:43.556010 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:43.563602 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:43.563582 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:44.531321 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:44.531232 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:44.540319 ip-10-0-138-227 
kubenswrapper[2569]: I0416 14:33:44.540296 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:44.551853 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:44.551829 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:44.573190 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:44.573163 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:44.581529 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:44.581510 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:45.584979 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:45.584951 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:45.592812 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:45.592789 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:45.604693 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:45.604639 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:45.625867 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:45.625845 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:45.634743 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:45.634724 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:46.651819 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:46.651782 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:46.661832 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:46.661808 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:46.672861 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:46.672833 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:46.698125 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:46.698094 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 
14:33:46.707306 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:46.707262 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:47.863372 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:47.863343 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:47.874139 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:47.874117 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:47.885800 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:47.885767 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:47.910446 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:47.910423 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:47.919702 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:47.919686 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:48.681586 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.681545 2569 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="llm-d-routing-sidecar" containerID="cri-o://88ae3ade40afbf6f0d4873b04422f0413ac08f01d2edef6a9fb9967115ff1e56" gracePeriod=2 Apr 16 14:33:48.876713 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.876689 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:48.877366 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.877343 2569 generic.go:358] "Generic (PLEG): container finished" podID="01da8c6e-9c53-46bc-a992-127910e8de45" containerID="32354bc47eda3deb1640aee64c1782e6cc7037ebc6dfce41b77ab3672c371eb1" exitCode=137 Apr 16 14:33:48.877366 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.877367 2569 generic.go:358] "Generic (PLEG): container finished" podID="01da8c6e-9c53-46bc-a992-127910e8de45" containerID="88ae3ade40afbf6f0d4873b04422f0413ac08f01d2edef6a9fb9967115ff1e56" exitCode=0 Apr 16 14:33:48.877514 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.877384 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" event={"ID":"01da8c6e-9c53-46bc-a992-127910e8de45","Type":"ContainerDied","Data":"32354bc47eda3deb1640aee64c1782e6cc7037ebc6dfce41b77ab3672c371eb1"} Apr 16 14:33:48.877514 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.877412 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" 
event={"ID":"01da8c6e-9c53-46bc-a992-127910e8de45","Type":"ContainerDied","Data":"88ae3ade40afbf6f0d4873b04422f0413ac08f01d2edef6a9fb9967115ff1e56"} Apr 16 14:33:48.878852 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.878828 2569 generic.go:358] "Generic (PLEG): container finished" podID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerID="b75ece7675e3e960e50352e5d335258103bcac884ae4a463ed69871c118be7b4" exitCode=137 Apr 16 14:33:48.878928 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.878877 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" event={"ID":"f37ed6b5-8349-4076-8cc9-e35c3644b2a3","Type":"ContainerDied","Data":"b75ece7675e3e960e50352e5d335258103bcac884ae4a463ed69871c118be7b4"} Apr 16 14:33:48.950893 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.950876 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:48.951602 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.951586 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:33:48.954877 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:48.954854 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:33:49.025838 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.025812 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:49.043390 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.043366 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/llm-d-routing-sidecar/0.log" Apr 16 14:33:49.049092 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049074 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-home\") pod \"01da8c6e-9c53-46bc-a992-127910e8de45\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " Apr 16 14:33:49.049175 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049109 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thkjb\" (UniqueName: \"kubernetes.io/projected/01da8c6e-9c53-46bc-a992-127910e8de45-kube-api-access-thkjb\") pod \"01da8c6e-9c53-46bc-a992-127910e8de45\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " Apr 16 14:33:49.049175 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049125 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-home\") pod \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " Apr 16 14:33:49.049296 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049184 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-dshm\") pod \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " Apr 16 14:33:49.049296 
ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049230 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-model-cache\") pod \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " Apr 16 14:33:49.049296 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049259 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kserve-provision-location\") pod \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " Apr 16 14:33:49.049454 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049328 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-kserve-provision-location\") pod \"01da8c6e-9c53-46bc-a992-127910e8de45\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " Apr 16 14:33:49.049454 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049353 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xv4mw\" (UniqueName: \"kubernetes.io/projected/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kube-api-access-xv4mw\") pod \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " Apr 16 14:33:49.049454 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049399 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01da8c6e-9c53-46bc-a992-127910e8de45-tls-certs\") pod \"01da8c6e-9c53-46bc-a992-127910e8de45\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " Apr 16 14:33:49.049454 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049423 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-model-cache\") pod \"01da8c6e-9c53-46bc-a992-127910e8de45\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " Apr 16 14:33:49.049454 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049452 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-dshm\") pod \"01da8c6e-9c53-46bc-a992-127910e8de45\" (UID: \"01da8c6e-9c53-46bc-a992-127910e8de45\") " Apr 16 14:33:49.049712 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049480 2569 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-tls-certs\") pod \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\" (UID: \"f37ed6b5-8349-4076-8cc9-e35c3644b2a3\") " Apr 16 14:33:49.049712 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049514 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-home" (OuterVolumeSpecName: "home") pod "01da8c6e-9c53-46bc-a992-127910e8de45" (UID: "01da8c6e-9c53-46bc-a992-127910e8de45"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:49.049817 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049726 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-home\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.049817 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049523 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-home" (OuterVolumeSpecName: "home") pod "f37ed6b5-8349-4076-8cc9-e35c3644b2a3" (UID: "f37ed6b5-8349-4076-8cc9-e35c3644b2a3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:49.049817 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049716 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-model-cache" (OuterVolumeSpecName: "model-cache") pod "f37ed6b5-8349-4076-8cc9-e35c3644b2a3" (UID: "f37ed6b5-8349-4076-8cc9-e35c3644b2a3"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:49.049817 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.049799 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-model-cache" (OuterVolumeSpecName: "model-cache") pod "01da8c6e-9c53-46bc-a992-127910e8de45" (UID: "01da8c6e-9c53-46bc-a992-127910e8de45"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:49.051898 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.051872 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f37ed6b5-8349-4076-8cc9-e35c3644b2a3" (UID: "f37ed6b5-8349-4076-8cc9-e35c3644b2a3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:49.052022 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.051916 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/01da8c6e-9c53-46bc-a992-127910e8de45-kube-api-access-thkjb" (OuterVolumeSpecName: "kube-api-access-thkjb") pod "01da8c6e-9c53-46bc-a992-127910e8de45" (UID: "01da8c6e-9c53-46bc-a992-127910e8de45"). InnerVolumeSpecName "kube-api-access-thkjb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:33:49.052109 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.052084 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/01da8c6e-9c53-46bc-a992-127910e8de45-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "01da8c6e-9c53-46bc-a992-127910e8de45" (UID: "01da8c6e-9c53-46bc-a992-127910e8de45"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 16 14:33:49.052392 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.052360 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-dshm" (OuterVolumeSpecName: "dshm") pod "f37ed6b5-8349-4076-8cc9-e35c3644b2a3" (UID: "f37ed6b5-8349-4076-8cc9-e35c3644b2a3"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:49.052489 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.052465 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-dshm" (OuterVolumeSpecName: "dshm") pod "01da8c6e-9c53-46bc-a992-127910e8de45" (UID: "01da8c6e-9c53-46bc-a992-127910e8de45"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:49.052873 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.052855 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kube-api-access-xv4mw" (OuterVolumeSpecName: "kube-api-access-xv4mw") pod "f37ed6b5-8349-4076-8cc9-e35c3644b2a3" (UID: "f37ed6b5-8349-4076-8cc9-e35c3644b2a3"). InnerVolumeSpecName "kube-api-access-xv4mw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 16 14:33:49.056124 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.056102 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/storage-initializer/0.log" Apr 16 14:33:49.082305 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.082284 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/main/0.log" Apr 16 14:33:49.092105 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.092083 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7_f37ed6b5-8349-4076-8cc9-e35c3644b2a3/storage-initializer/0.log" Apr 16 14:33:49.114513 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.114479 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f37ed6b5-8349-4076-8cc9-e35c3644b2a3" (UID: "f37ed6b5-8349-4076-8cc9-e35c3644b2a3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:49.114618 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.114594 2569 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "01da8c6e-9c53-46bc-a992-127910e8de45" (UID: "01da8c6e-9c53-46bc-a992-127910e8de45"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 16 14:33:49.150176 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150153 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-dshm\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150176 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150172 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-model-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150181 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150191 2569 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-kserve-provision-location\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150201 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xv4mw\" (UniqueName: \"kubernetes.io/projected/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-kube-api-access-xv4mw\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150212 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/01da8c6e-9c53-46bc-a992-127910e8de45-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150235 2569 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-model-cache\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150243 2569 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/01da8c6e-9c53-46bc-a992-127910e8de45-dshm\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150251 2569 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-tls-certs\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150259 2569 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thkjb\" (UniqueName: \"kubernetes.io/projected/01da8c6e-9c53-46bc-a992-127910e8de45-kube-api-access-thkjb\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.150318 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.150288 2569 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/f37ed6b5-8349-4076-8cc9-e35c3644b2a3-home\") on node \"ip-10-0-138-227.ec2.internal\" DevicePath \"\"" Apr 16 14:33:49.883219 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.883194 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj_01da8c6e-9c53-46bc-a992-127910e8de45/main/0.log" Apr 16 14:33:49.883873 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.883849 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" Apr 16 14:33:49.884009 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.883845 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj" event={"ID":"01da8c6e-9c53-46bc-a992-127910e8de45","Type":"ContainerDied","Data":"2f94c0c0ba1139f9d24280999c829172546c9adb84c49bd1ddf3c8b4c4f0ecd8"} Apr 16 14:33:49.884009 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.883966 2569 scope.go:117] "RemoveContainer" containerID="32354bc47eda3deb1640aee64c1782e6cc7037ebc6dfce41b77ab3672c371eb1" Apr 16 14:33:49.885346 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.885319 2569 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" Apr 16 14:33:49.885346 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.885334 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7" event={"ID":"f37ed6b5-8349-4076-8cc9-e35c3644b2a3","Type":"ContainerDied","Data":"1ed53079ecfff1b112e82987f7c54136b795c08db4a1775343690785d9fe8b43"} Apr 16 14:33:49.906762 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.906740 2569 scope.go:117] "RemoveContainer" containerID="96873a5215e7aed9d17d1481a632110928688d88ddfad156987158c1b61e058c" Apr 16 14:33:49.912578 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.912554 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj"] Apr 16 14:33:49.915707 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.915481 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-7bbbfd96c9-bbdxj"] Apr 16 14:33:49.927581 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.927558 2569 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7"] Apr 16 14:33:49.932199 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.932171 2569 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-76fd6697cb-wbjt7"] Apr 16 14:33:49.973410 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.973384 2569 scope.go:117] "RemoveContainer" containerID="88ae3ade40afbf6f0d4873b04422f0413ac08f01d2edef6a9fb9967115ff1e56" Apr 16 14:33:49.980787 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.980766 2569 scope.go:117] "RemoveContainer" containerID="b75ece7675e3e960e50352e5d335258103bcac884ae4a463ed69871c118be7b4" Apr 16 14:33:49.999371 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:49.999351 2569 scope.go:117] "RemoveContainer" containerID="578915168caf68011d18104465a1bfc63a011c2e37df6a67ded6f5675d86c532" Apr 16 14:33:50.222429 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:50.222394 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-tpb64_a32a6135-bf34-4999-bce9-2bf65c6ec74a/discovery/0.log" Apr 16 14:33:50.347043 ip-10-0-138-227 
kubenswrapper[2569]: I0416 14:33:50.347003 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" path="/var/lib/kubelet/pods/01da8c6e-9c53-46bc-a992-127910e8de45/volumes" Apr 16 14:33:50.347590 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:50.347569 2569 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" path="/var/lib/kubelet/pods/f37ed6b5-8349-4076-8cc9-e35c3644b2a3/volumes" Apr 16 14:33:51.324251 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:51.324222 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-tpb64_a32a6135-bf34-4999-bce9-2bf65c6ec74a/discovery/0.log" Apr 16 14:33:52.753635 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:52.753604 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-c228b_9b559eee-5fae-42f4-b45b-6df0c23fac72/manager/0.log" Apr 16 14:33:52.832368 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:52.832339 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-4bk89_d6b0287d-02fb-422c-9642-d0cabe3c48e4/manager/0.log" Apr 16 14:33:55.380816 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.380776 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kz48r/must-gather-7g2n8"] Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381091 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="storage-initializer" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381102 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="storage-initializer" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381113 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="tokenizer" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381118 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="tokenizer" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381127 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="storage-initializer" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381133 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="storage-initializer" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381140 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="storage-initializer" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381145 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="storage-initializer" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381154 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" Apr 16 14:33:55.381187 ip-10-0-138-227 
kubenswrapper[2569]: I0416 14:33:55.381159 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381165 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381171 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381182 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="main" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381187 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="main" Apr 16 14:33:55.381187 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381194 2569 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="llm-d-routing-sidecar" Apr 16 14:33:55.381715 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381199 2569 state_mem.go:107] "Deleted CPUSet assignment" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="llm-d-routing-sidecar" Apr 16 14:33:55.381715 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381243 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="main" Apr 16 14:33:55.381715 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381252 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="30b7247a-12d1-4724-90f2-08cb0e52bcbd" containerName="tokenizer" Apr 16 14:33:55.381715 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381258 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="main" Apr 16 14:33:55.381715 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381278 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="f37ed6b5-8349-4076-8cc9-e35c3644b2a3" containerName="main" Apr 16 14:33:55.381715 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.381284 2569 memory_manager.go:356] "RemoveStaleState removing state" podUID="01da8c6e-9c53-46bc-a992-127910e8de45" containerName="llm-d-routing-sidecar" Apr 16 14:33:55.384011 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.383995 2569 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-kz48r/must-gather-7g2n8" Apr 16 14:33:55.387484 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.387463 2569 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-kz48r\"/\"default-dockercfg-spcjg\"" Apr 16 14:33:55.388738 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.388719 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kz48r\"/\"kube-root-ca.crt\"" Apr 16 14:33:55.388738 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.388733 2569 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-kz48r\"/\"openshift-service-ca.crt\"" Apr 16 14:33:55.399567 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.399543 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz48r/must-gather-7g2n8"] Apr 16 14:33:55.497117 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.497087 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-n85cg\" (UniqueName: \"kubernetes.io/projected/974721c5-4b2a-470c-b09e-bf5c9a99db5b-kube-api-access-n85cg\") pod \"must-gather-7g2n8\" (UID: \"974721c5-4b2a-470c-b09e-bf5c9a99db5b\") " pod="openshift-must-gather-kz48r/must-gather-7g2n8" Apr 16 14:33:55.497117 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.497126 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/974721c5-4b2a-470c-b09e-bf5c9a99db5b-must-gather-output\") pod \"must-gather-7g2n8\" (UID: \"974721c5-4b2a-470c-b09e-bf5c9a99db5b\") " pod="openshift-must-gather-kz48r/must-gather-7g2n8" Apr 16 14:33:55.598157 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.598119 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-n85cg\" (UniqueName: \"kubernetes.io/projected/974721c5-4b2a-470c-b09e-bf5c9a99db5b-kube-api-access-n85cg\") pod \"must-gather-7g2n8\" (UID: \"974721c5-4b2a-470c-b09e-bf5c9a99db5b\") " pod="openshift-must-gather-kz48r/must-gather-7g2n8" Apr 16 14:33:55.598360 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.598164 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/974721c5-4b2a-470c-b09e-bf5c9a99db5b-must-gather-output\") pod \"must-gather-7g2n8\" (UID: \"974721c5-4b2a-470c-b09e-bf5c9a99db5b\") " pod="openshift-must-gather-kz48r/must-gather-7g2n8" Apr 16 14:33:55.598569 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.598551 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/974721c5-4b2a-470c-b09e-bf5c9a99db5b-must-gather-output\") pod \"must-gather-7g2n8\" (UID: \"974721c5-4b2a-470c-b09e-bf5c9a99db5b\") " pod="openshift-must-gather-kz48r/must-gather-7g2n8" Apr 16 14:33:55.606886 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.606869 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-n85cg\" (UniqueName: \"kubernetes.io/projected/974721c5-4b2a-470c-b09e-bf5c9a99db5b-kube-api-access-n85cg\") pod \"must-gather-7g2n8\" (UID: \"974721c5-4b2a-470c-b09e-bf5c9a99db5b\") " pod="openshift-must-gather-kz48r/must-gather-7g2n8" Apr 16 14:33:55.692805 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:55.692725 2569 util.go:30] "No sandbox 
for pod can be found. Need to start a new one" pod="openshift-must-gather-kz48r/must-gather-7g2n8" Apr 16 14:33:56.029521 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:56.029497 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz48r/must-gather-7g2n8"] Apr 16 14:33:56.031997 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:33:56.031968 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod974721c5_4b2a_470c_b09e_bf5c9a99db5b.slice/crio-2d9e9f16aff665d5f780262cf23fa1ca185d1c1fe1c47e597660494e63c834fc WatchSource:0}: Error finding container 2d9e9f16aff665d5f780262cf23fa1ca185d1c1fe1c47e597660494e63c834fc: Status 404 returned error can't find the container with id 2d9e9f16aff665d5f780262cf23fa1ca185d1c1fe1c47e597660494e63c834fc Apr 16 14:33:56.033802 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:56.033779 2569 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 16 14:33:56.909016 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:56.908992 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/must-gather-7g2n8" event={"ID":"974721c5-4b2a-470c-b09e-bf5c9a99db5b","Type":"ContainerStarted","Data":"2d9e9f16aff665d5f780262cf23fa1ca185d1c1fe1c47e597660494e63c834fc"} Apr 16 14:33:57.914190 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:57.914154 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/must-gather-7g2n8" event={"ID":"974721c5-4b2a-470c-b09e-bf5c9a99db5b","Type":"ContainerStarted","Data":"b04254020c05c84d1ff9858cd56e8df86db01c19fb9bddf30d8fb7d59aad1041"} Apr 16 14:33:57.914190 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:57.914189 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/must-gather-7g2n8" event={"ID":"974721c5-4b2a-470c-b09e-bf5c9a99db5b","Type":"ContainerStarted","Data":"ff333f63f5934806d03848bfce20ef7ece3a806c2dc8842bfb67332654b211ca"} Apr 16 14:33:57.932381 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:57.932327 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kz48r/must-gather-7g2n8" podStartSLOduration=2.156164512 podStartE2EDuration="2.932313186s" podCreationTimestamp="2026-04-16 14:33:55 +0000 UTC" firstStartedPulling="2026-04-16 14:33:56.033970414 +0000 UTC m=+2076.227786337" lastFinishedPulling="2026-04-16 14:33:56.810119098 +0000 UTC m=+2077.003935011" observedRunningTime="2026-04-16 14:33:57.930392608 +0000 UTC m=+2078.124208545" watchObservedRunningTime="2026-04-16 14:33:57.932313186 +0000 UTC m=+2078.126129188" Apr 16 14:33:58.238762 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:58.238671 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-fmbxr_09070b7d-cdb7-4268-8a8b-096e6b5cff88/global-pull-secret-syncer/0.log" Apr 16 14:33:58.359869 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:58.359833 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-xc2dz_865317bf-68d0-4437-94d2-7e1b8f99dbb1/konnectivity-agent/0.log" Apr 16 14:33:58.427944 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:33:58.427882 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-138-227.ec2.internal_ed5d0e13f67e25091a3eceaddbaeb6c3/haproxy/0.log" Apr 16 14:34:02.497836 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:02.497791 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-operator-7587b89b76-c228b_9b559eee-5fae-42f4-b45b-6df0c23fac72/manager/0.log" Apr 16 14:34:02.606029 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:02.605988 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-operator-controller-manager-6ddf9554fc-4bk89_d6b0287d-02fb-422c-9642-d0cabe3c48e4/manager/0.log" Apr 16 14:34:03.883356 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:03.883324 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7j9ch_4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914/node-exporter/0.log" Apr 16 14:34:03.903092 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:03.903062 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7j9ch_4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914/kube-rbac-proxy/0.log" Apr 16 14:34:03.925156 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:03.925127 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-7j9ch_4c2a5a7c-50f6-4b62-8d7c-4f392ea3a914/init-textfile/0.log" Apr 16 14:34:07.243712 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.243678 2569 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx"] Apr 16 14:34:07.249289 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.249251 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.259791 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.259765 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx"] Apr 16 14:34:07.304867 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.304829 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-podres\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.305033 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.304895 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-proc\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.305033 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.304919 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-lib-modules\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.305033 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.304973 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vj8h6\" (UniqueName: \"kubernetes.io/projected/f36d1ab7-2e2c-4027-b439-0641c88b74c5-kube-api-access-vj8h6\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " 
pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.305194 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.305086 2569 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-sys\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406252 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406213 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-podres\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406252 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406258 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-proc\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406295 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-lib-modules\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406331 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vj8h6\" (UniqueName: \"kubernetes.io/projected/f36d1ab7-2e2c-4027-b439-0641c88b74c5-kube-api-access-vj8h6\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406368 2569 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-sys\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406366 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-proc\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406381 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-podres\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406432 2569 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-sys\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.406507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.406473 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/f36d1ab7-2e2c-4027-b439-0641c88b74c5-lib-modules\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.414826 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.414781 2569 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vj8h6\" (UniqueName: \"kubernetes.io/projected/f36d1ab7-2e2c-4027-b439-0641c88b74c5-kube-api-access-vj8h6\") pod \"perf-node-gather-daemonset-8v4nx\" (UID: \"f36d1ab7-2e2c-4027-b439-0641c88b74c5\") " pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.561596 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.561515 2569 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.713107 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.713083 2569 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx"] Apr 16 14:34:07.715553 ip-10-0-138-227 kubenswrapper[2569]: W0416 14:34:07.715525 2569 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podf36d1ab7_2e2c_4027_b439_0641c88b74c5.slice/crio-9e4364651a5806b6ab0412cfb5c1cb0e53081787ba34fc8ca1002e66ea66c189 WatchSource:0}: Error finding container 9e4364651a5806b6ab0412cfb5c1cb0e53081787ba34fc8ca1002e66ea66c189: Status 404 returned error can't find the container with id 9e4364651a5806b6ab0412cfb5c1cb0e53081787ba34fc8ca1002e66ea66c189 Apr 16 14:34:07.942004 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.941917 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pj7tn_8e960e93-2ea2-4f10-b638-d01eb132a93d/dns/0.log" Apr 16 14:34:07.958813 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.958781 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" event={"ID":"f36d1ab7-2e2c-4027-b439-0641c88b74c5","Type":"ContainerStarted","Data":"7b1a00896f16f54805804a99fec5e6a1dc069a720c8e0f3c6a0606afe46439e9"} Apr 16 14:34:07.958813 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.958815 2569 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" event={"ID":"f36d1ab7-2e2c-4027-b439-0641c88b74c5","Type":"ContainerStarted","Data":"9e4364651a5806b6ab0412cfb5c1cb0e53081787ba34fc8ca1002e66ea66c189"} Apr 16 14:34:07.959036 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.958908 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:07.961228 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:07.961208 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-pj7tn_8e960e93-2ea2-4f10-b638-d01eb132a93d/kube-rbac-proxy/0.log" Apr 16 14:34:07.975430 ip-10-0-138-227 
kubenswrapper[2569]: I0416 14:34:07.975384 2569 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" podStartSLOduration=0.975368195 podStartE2EDuration="975.368195ms" podCreationTimestamp="2026-04-16 14:34:07 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-16 14:34:07.9746992 +0000 UTC m=+2088.168515135" watchObservedRunningTime="2026-04-16 14:34:07.975368195 +0000 UTC m=+2088.169184171" Apr 16 14:34:08.113320 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:08.113296 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-qt8cq_19600904-44a6-4d4d-aaf8-38af8d1e94b3/dns-node-resolver/0.log" Apr 16 14:34:08.559124 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:08.559096 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7qrkr_bafb680b-a4fd-4b7b-931c-6d0198bbe401/node-ca/0.log" Apr 16 14:34:09.453017 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:09.452978 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_istiod-openshift-gateway-55ff986f96-tpb64_a32a6135-bf34-4999-bce9-2bf65c6ec74a/discovery/0.log" Apr 16 14:34:09.927690 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:09.927657 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-gwt4s_1c7e2532-bb8b-4069-a258-035e33ccef02/serve-healthcheck-canary/0.log" Apr 16 14:34:10.496475 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:10.496448 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6dwk_ad87cd87-9cdc-4f67-9399-3c3befc3e0d4/kube-rbac-proxy/0.log" Apr 16 14:34:10.520580 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:10.520554 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6dwk_ad87cd87-9cdc-4f67-9399-3c3befc3e0d4/exporter/0.log" Apr 16 14:34:10.543832 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:10.543797 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-f6dwk_ad87cd87-9cdc-4f67-9399-3c3befc3e0d4/extractor/0.log" Apr 16 14:34:13.165153 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:13.165116 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-rs8j2_296ccc38-fbad-4869-915f-9dc8608b6e24/openshift-lws-operator/0.log" Apr 16 14:34:13.663455 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:13.663424 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-75d667c7c4-hrxgl_9e804e63-6427-4ecd-ad17-f5b39340b162/manager/0.log" Apr 16 14:34:13.726563 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:13.726533 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-dw5c9_c3596ae7-10db-4eae-81bb-18f75dccd1fc/server/0.log" Apr 16 14:34:13.974507 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:13.974434 2569 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-kz48r/perf-node-gather-daemonset-8v4nx" Apr 16 14:34:14.111459 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:14.111424 2569 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve_odh-model-controller-696fc77849-z4b9p_46cf7836-eb7e-4810-a85d-22f2e3b83333/manager/0.log" Apr 16 14:34:20.084541 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.084512 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2j4nh_5445e4c9-57ac-4c32-964e-2165416593b6/kube-multus-additional-cni-plugins/0.log" Apr 16 14:34:20.104930 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.104908 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2j4nh_5445e4c9-57ac-4c32-964e-2165416593b6/egress-router-binary-copy/0.log" Apr 16 14:34:20.124403 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.124377 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2j4nh_5445e4c9-57ac-4c32-964e-2165416593b6/cni-plugins/0.log" Apr 16 14:34:20.143788 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.143770 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2j4nh_5445e4c9-57ac-4c32-964e-2165416593b6/bond-cni-plugin/0.log" Apr 16 14:34:20.162724 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.162695 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2j4nh_5445e4c9-57ac-4c32-964e-2165416593b6/routeoverride-cni/0.log" Apr 16 14:34:20.182210 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.182182 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2j4nh_5445e4c9-57ac-4c32-964e-2165416593b6/whereabouts-cni-bincopy/0.log" Apr 16 14:34:20.202304 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.202279 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-2j4nh_5445e4c9-57ac-4c32-964e-2165416593b6/whereabouts-cni/0.log" Apr 16 14:34:20.565439 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.565402 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-lp46m_0075fe44-19cb-4f01-845d-1e50708704ff/kube-multus/0.log" Apr 16 14:34:20.660127 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.660092 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7pb8h_1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17/network-metrics-daemon/0.log" Apr 16 14:34:20.678748 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:20.678673 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-7pb8h_1e697f2d-12ee-44d4-bd4b-71aa2cf8ee17/kube-rbac-proxy/0.log" Apr 16 14:34:22.109597 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:22.109571 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf96q_c58f9919-25e7-4c88-ac45-816be0bc0b3a/ovn-controller/0.log" Apr 16 14:34:22.145082 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:22.145058 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf96q_c58f9919-25e7-4c88-ac45-816be0bc0b3a/ovn-acl-logging/0.log" Apr 16 14:34:22.164397 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:22.164376 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf96q_c58f9919-25e7-4c88-ac45-816be0bc0b3a/kube-rbac-proxy-node/0.log" Apr 16 14:34:22.185339 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:22.185317 2569 
log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf96q_c58f9919-25e7-4c88-ac45-816be0bc0b3a/kube-rbac-proxy-ovn-metrics/0.log" Apr 16 14:34:22.204482 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:22.204452 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf96q_c58f9919-25e7-4c88-ac45-816be0bc0b3a/northd/0.log" Apr 16 14:34:22.228880 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:22.228853 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf96q_c58f9919-25e7-4c88-ac45-816be0bc0b3a/nbdb/0.log" Apr 16 14:34:22.255058 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:22.255040 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf96q_c58f9919-25e7-4c88-ac45-816be0bc0b3a/sbdb/0.log" Apr 16 14:34:22.433995 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:22.433920 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-xf96q_c58f9919-25e7-4c88-ac45-816be0bc0b3a/ovnkube-controller/0.log" Apr 16 14:34:23.533410 ip-10-0-138-227 kubenswrapper[2569]: I0416 14:34:23.533370 2569 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-6spc4_cab10f5b-6eb5-409e-a6a8-a1bf534e28e2/network-check-target-container/0.log"