Apr 24 21:27:28.094285 ip-10-0-133-36 systemd[1]: Starting Kubernetes Kubelet...
Apr 24 21:27:28.477269 ip-10-0-133-36 kubenswrapper[2580]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:28.477269 ip-10-0-133-36 kubenswrapper[2580]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 24 21:27:28.477269 ip-10-0-133-36 kubenswrapper[2580]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 24 21:27:28.477269 ip-10-0-133-36 kubenswrapper[2580]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 24 21:27:28.477269 ip-10-0-133-36 kubenswrapper[2580]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
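Each deprecation warning above points at the kubelet config file. As a minimal sketch, the flagged CLI flags map onto `KubeletConfiguration` fields roughly as follows — field names are from the `kubelet.config.k8s.io/v1beta1` API, but every value here is illustrative, not read from this node (`--pod-infra-container-image` has no config-file equivalent; per the log, the sandbox image is taken from the CRI runtime instead):

```yaml
# Hypothetical KubeletConfiguration sketch; values are examples only.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# replaces --container-runtime-endpoint
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"
# replaces --volume-plugin-dir (example path, not this node's setting)
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec"
# replaces --system-reserved; key=value pairs become a map
systemReserved:
  cpu: "500m"
  memory: "1Gi"
# replaces --minimum-container-ttl-duration: use eviction thresholds instead
evictionHard:
  memory.available: "100Mi"
```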
Apr 24 21:27:28.480200 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.480111    2580 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 24 21:27:28.482977 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.482963    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:28.482977 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.482978    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.482982    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.482985    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.482989    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483005    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483009    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483012    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483015    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483018    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483021    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483024    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483027    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483030    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483040    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483043    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483046    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483049    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483052    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483055    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483058    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:28.483052 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483061    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483064    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483068    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483072    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483075    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483078    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483081    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483083    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483086    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483089    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483091    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483094    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483102    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483105    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483107    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483110    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483112    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483115    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483117    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483120    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:28.483539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483123    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483125    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483128    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483130    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483133    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483136    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483141    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483144    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483147    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483149    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483152    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483155    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483157    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483160    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483163    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483165    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483168    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483171    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483174    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
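The long run of `unrecognized feature gate` warnings is benign startup noise: the gate names (ManagedBootImages, MachineConfigNodes, Insights*, NewOLM*, and so on) appear to be cluster-level OpenShift gates that the embedded Kubernetes feature-gate parser in the kubelet does not know about, so it warns and ignores them. Gates the kubelet itself should honor belong in the config file rather than on `--feature-gates`; a minimal sketch, assuming the `kubelet.config.k8s.io/v1beta1` API (the gate name below is a placeholder, not a recommendation):

```yaml
# Hypothetical fragment: --feature-gates=Name=true maps to this map.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
featureGates:
  ExampleGateName: true  # placeholder name, not a real gate
```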
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483176    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:28.484071 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483179    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483181    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483184    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483186    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483189    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483192    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483195    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483198    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483201    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483204    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483206    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483209    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483211    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483214    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483216    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483218    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483221    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483223    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483227    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483229    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:28.484553 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483232    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483234    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483237    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483239    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.483242    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484936    2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484943    2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484947    2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484951    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484954    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484957    2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484960    2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484962    2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484965    2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484968    2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484971    2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484973    2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484977    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484980    2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:28.485043 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484983    2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484985    2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.484988    2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485001    2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485004    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485007    2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485010    2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485012    2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485014    2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485017    2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485020    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485022    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485025    2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485027    2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485030    2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485034    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485036    2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485039    2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485041    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485044    2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:28.485500 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485047    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485050    2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485053    2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485055    2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485058    2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485061    2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485063    2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485066    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485068    2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485071    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485074    2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485077    2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485079    2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485083    2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485087    2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485089    2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485093    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485095    2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485098    2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:28.485989 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485100    2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485103    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485105    2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485108    2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485110    2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485113    2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485115    2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485118    2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485122    2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485125    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485128    2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485130    2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485133    2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485135    2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485138    2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485141    2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485143    2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485146    2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485149    2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485151    2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:28.486510 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485155    2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485157    2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485159    2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485162    2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485164    2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485167    2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485170    2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485172    2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485175    2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485178    2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485180    2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485183    2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485185    2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485250    2580 flags.go:64] FLAG: --address="0.0.0.0"
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485257    2580 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485263    2580 flags.go:64] FLAG: --anonymous-auth="true"
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485267    2580 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485271    2580 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485277    2580 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485281    2580 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485290    2580 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 24 21:27:28.487008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485295    2580 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485300    2580 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485307    2580 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485310    2580 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485313    2580 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485316    2580 flags.go:64] FLAG: --cgroup-root=""
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485319    2580 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485322    2580 flags.go:64] FLAG: --client-ca-file=""
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485325    2580 flags.go:64] FLAG: --cloud-config=""
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485328    2580 flags.go:64] FLAG: --cloud-provider="external"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485330    2580 flags.go:64] FLAG: --cluster-dns="[]"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485335    2580 flags.go:64] FLAG: --cluster-domain=""
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485338    2580 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485341    2580 flags.go:64] FLAG: --config-dir=""
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485344    2580 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485347    2580 flags.go:64] FLAG: --container-log-max-files="5"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485351    2580 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485355    2580 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485358    2580 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485361    2580 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485364    2580 flags.go:64] FLAG: --contention-profiling="false"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485367    2580 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485370    2580 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485373    2580 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485376    2580 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 24 21:27:28.487530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485380    2580 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485382    2580 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485385    2580 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485388    2580 flags.go:64] FLAG: --enable-load-reader="false"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485392    2580 flags.go:64] FLAG: --enable-server="true"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485395    2580 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485399    2580 flags.go:64] FLAG: --event-burst="100"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485403    2580 flags.go:64] FLAG: --event-qps="50"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485406    2580 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485409    2580 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485412    2580 flags.go:64] FLAG: --eviction-hard=""
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485416    2580 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485418    2580 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485421    2580 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485424    2580 flags.go:64] FLAG: --eviction-soft=""
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485427    2580 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485430    2580 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485433    2580 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485435    2580 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485438    2580 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485441    2580 flags.go:64] FLAG: --fail-swap-on="true"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485444    2580 flags.go:64] FLAG: --feature-gates=""
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485448    2580 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485451    2580 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 24 21:27:28.488165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485454    2580 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 24 21:27:28.488165
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485463 2580 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485467 2580 flags.go:64] FLAG: --healthz-port="10248" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485470 2580 flags.go:64] FLAG: --help="false" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485472 2580 flags.go:64] FLAG: --hostname-override="ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485476 2580 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485479 2580 flags.go:64] FLAG: --http-check-frequency="20s" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485482 2580 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485485 2580 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485489 2580 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485491 2580 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485494 2580 flags.go:64] FLAG: --image-service-endpoint="" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485497 2580 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485500 2580 flags.go:64] FLAG: --kube-api-burst="100" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485503 2580 flags.go:64] FLAG: 
--kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485506 2580 flags.go:64] FLAG: --kube-api-qps="50" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485509 2580 flags.go:64] FLAG: --kube-reserved="" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485512 2580 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485516 2580 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485519 2580 flags.go:64] FLAG: --kubelet-cgroups="" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485522 2580 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485525 2580 flags.go:64] FLAG: --lock-file="" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485528 2580 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485531 2580 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485533 2580 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 24 21:27:28.488763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485538 2580 flags.go:64] FLAG: --log-json-split-stream="false" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485541 2580 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485544 2580 flags.go:64] FLAG: --log-text-split-stream="false" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485547 2580 flags.go:64] FLAG: --logging-format="text" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:27:28.485550 2580 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485553 2580 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485556 2580 flags.go:64] FLAG: --manifest-url="" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485559 2580 flags.go:64] FLAG: --manifest-url-header="" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485563 2580 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485572 2580 flags.go:64] FLAG: --max-open-files="1000000" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485576 2580 flags.go:64] FLAG: --max-pods="110" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485579 2580 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485582 2580 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485585 2580 flags.go:64] FLAG: --memory-manager-policy="None" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485588 2580 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485591 2580 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485594 2580 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485597 2580 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485604 2580 flags.go:64] 
FLAG: --node-status-max-images="50" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485607 2580 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485611 2580 flags.go:64] FLAG: --oom-score-adj="-999" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485614 2580 flags.go:64] FLAG: --pod-cidr="" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485617 2580 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 24 21:27:28.489354 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485627 2580 flags.go:64] FLAG: --pod-manifest-path="" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485630 2580 flags.go:64] FLAG: --pod-max-pids="-1" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485633 2580 flags.go:64] FLAG: --pods-per-core="0" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485636 2580 flags.go:64] FLAG: --port="10250" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485639 2580 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485642 2580 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0638b6f7b194c3dcb" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485645 2580 flags.go:64] FLAG: --qos-reserved="" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485647 2580 flags.go:64] FLAG: --read-only-port="10255" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485650 2580 flags.go:64] FLAG: --register-node="true" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485653 2580 flags.go:64] FLAG: --register-schedulable="true" Apr 24 
21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485656 2580 flags.go:64] FLAG: --register-with-taints="" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485663 2580 flags.go:64] FLAG: --registry-burst="10" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485666 2580 flags.go:64] FLAG: --registry-qps="5" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485669 2580 flags.go:64] FLAG: --reserved-cpus="" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485671 2580 flags.go:64] FLAG: --reserved-memory="" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485675 2580 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485678 2580 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485681 2580 flags.go:64] FLAG: --rotate-certificates="false" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485684 2580 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485693 2580 flags.go:64] FLAG: --runonce="false" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485696 2580 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485701 2580 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485704 2580 flags.go:64] FLAG: --seccomp-default="false" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485707 2580 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485710 2580 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" 
Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485713 2580 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 24 21:27:28.489897 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485716 2580 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485719 2580 flags.go:64] FLAG: --storage-driver-password="root" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485722 2580 flags.go:64] FLAG: --storage-driver-secure="false" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485724 2580 flags.go:64] FLAG: --storage-driver-table="stats" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485727 2580 flags.go:64] FLAG: --storage-driver-user="root" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485730 2580 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485733 2580 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485737 2580 flags.go:64] FLAG: --system-cgroups="" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485740 2580 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485745 2580 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485748 2580 flags.go:64] FLAG: --tls-cert-file="" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485751 2580 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485757 2580 flags.go:64] FLAG: --tls-min-version="" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485760 2580 
flags.go:64] FLAG: --tls-private-key-file="" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485763 2580 flags.go:64] FLAG: --topology-manager-policy="none" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485766 2580 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485769 2580 flags.go:64] FLAG: --topology-manager-scope="container" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485772 2580 flags.go:64] FLAG: --v="2" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485776 2580 flags.go:64] FLAG: --version="false" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485780 2580 flags.go:64] FLAG: --vmodule="" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485784 2580 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.485787 2580 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485875 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485878 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 24 21:27:28.490522 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485881 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485883 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485887 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485891 2580 
feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485894 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485896 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485899 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485901 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485904 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485907 2580 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485910 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485912 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485915 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485918 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485920 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485923 2580 feature_gate.go:328] unrecognized feature gate: 
OVNObservability Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485926 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485930 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485934 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 24 21:27:28.491110 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485936 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485939 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485942 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485944 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485947 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485950 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485952 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485955 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485957 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 24 21:27:28.491652 ip-10-0-133-36 
kubenswrapper[2580]: W0424 21:27:28.485960 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485962 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485965 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485967 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485970 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485972 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485975 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485981 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485984 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485986 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.485989 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 24 21:27:28.491652 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486005 2580 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486008 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 24 21:27:28.492177 
ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486011 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486014 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486016 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486019 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486025 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486028 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486030 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486033 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486035 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486038 2580 feature_gate.go:328] unrecognized feature gate: Example Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486041 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486043 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486046 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 
24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486048 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486051 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486054 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486056 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486059 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 24 21:27:28.492177 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486061 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486064 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486066 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486069 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486073 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486076 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486079 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486082 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486086 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486092 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486094 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486097 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486101 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486104 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486107 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486109 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486112 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486114 2580 feature_gate.go:328] unrecognized 
feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486117 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:28.492667 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486120 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:28.493153 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486122 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:28.493153 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486125 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:28.493153 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486127 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:28.493153 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486129 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:28.493153 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.486132 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:28.493153 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.486892 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:28.494227 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.494120 2580 server.go:530] "Kubelet version" kubeletVersion="v1.33.9"
Apr 24 21:27:28.494266 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.494229 2580 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK=""
Apr 24 21:27:28.494294 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494290 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494296 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494300 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494304 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494307 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494310 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494313 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494316 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494319 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494322 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494325 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:28.494323 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494328 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494331 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494334 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494336 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494339 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494342 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494345 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494347 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494350 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494353 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494357 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494361 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494364 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494367 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494370 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494373 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494376 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494379 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494381 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:28.494600 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494384 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494387 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494390 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494393 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494396 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494399 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494402 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494405 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494407 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494410 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494413 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494415 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494418 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494421 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494423 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494427 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494430 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494432 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494435 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:28.495106 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494438 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494441 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494443 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494446 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494448 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494451 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494453 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494456 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494459 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494461 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494464 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494467 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494469 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494472 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494474 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494477 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494480 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494482 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494485 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494487 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:28.495570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494489 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494492 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494494 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494497 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494499 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494503 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494506 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494509 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494512 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494514 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494518 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494521 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494524 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494526 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494529 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494531 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:28.496059 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494534 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.494539 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494645 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494650 2580 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494653 2580 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494656 2580 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494659 2580 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494662 2580 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494664 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494667 2580 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494670 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494672 2580 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494675 2580 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494678 2580 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494681 2580 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494683 2580 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 24 21:27:28.496497 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494686 2580 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494688 2580 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494691 2580 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494693 2580 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494696 2580 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494698 2580 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494701 2580 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494704 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494707 2580 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494710 2580 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494714 2580 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494716 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494720 2580 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494724 2580 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494728 2580 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494731 2580 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494734 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494737 2580 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494739 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 24 21:27:28.496897 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494742 2580 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494745 2580 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494748 2580 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494751 2580 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494755 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494758 2580 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494761 2580 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494764 2580 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494767 2580 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494770 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494773 2580 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494775 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494778 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494781 2580 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494783 2580 feature_gate.go:328] unrecognized feature gate: Example2
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494786 2580 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494788 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494791 2580 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494793 2580 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 24 21:27:28.497378 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494796 2580 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494799 2580 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494801 2580 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494804 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494807 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494809 2580 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494812 2580 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494815 2580 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494817 2580 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494820 2580 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494823 2580 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494826 2580 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494828 2580 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494831 2580 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494833 2580 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494836 2580 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494838 2580 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494840 2580 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494843 2580 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494846 2580 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 24 21:27:28.497858 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494848 2580 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494851 2580 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494853 2580 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494856 2580 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494859 2580 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494861 2580 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494864 2580 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494867 2580 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494869 2580 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494871 2580 feature_gate.go:328] unrecognized feature gate: Example
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494874 2580 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494876 2580 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494879 2580 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:28.494881 2580 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.494886 2580 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 24 21:27:28.498380 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.495702 2580 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 24 21:27:28.498801 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.497588 2580 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 24 21:27:28.498801 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.498573 2580 server.go:1019] "Starting client certificate rotation"
Apr 24 21:27:28.498801 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.498668 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:28.498801 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.498702 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 24 21:27:28.522841 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.522814 2580 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:28.528780 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.528759 2580 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 24 21:27:28.542268 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.542247 2580 log.go:25] "Validated CRI v1 runtime API"
Apr 24 21:27:28.547324 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.547311 2580 log.go:25] "Validated CRI v1 image API"
Apr 24 21:27:28.549747 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.549730 2580 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 24 21:27:28.554418 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.552894 2580 fs.go:135] Filesystem UUIDs: map[73637227-0665-4294-9391-fbd1e0301841:/dev/nvme0n1p3 7B77-95E7:/dev/nvme0n1p2 e96895a6-8a9e-471e-8cb6-c4208abd185a:/dev/nvme0n1p4]
Apr 24 21:27:28.554418 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.554408 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 24 21:27:28.554560 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.554414 2580 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 24 21:27:28.559688 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.559575 2580 manager.go:217] Machine: {Timestamp:2026-04-24 21:27:28.558344625 +0000 UTC m=+0.356407585 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3150051 MemoryCapacity:33164496896 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec225109f1cf5d1b31081c8cc694a2be SystemUUID:ec225109-f1cf-5d1b-3108-1c8cc694a2be BootID:4332496b-bca1-4a31-8487-deb0a7c231fd Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632902656 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582250496 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:0d:dc:53:20:ef Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:0d:dc:53:20:ef Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:13:b2:b6:f1:4f Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164496896 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 24 21:27:28.559688 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.559681 2580 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 24 21:27:28.559802 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.559757 2580 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 24 21:27:28.561699 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.561674 2580 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 24 21:27:28.561841 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.561702 2580 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-36.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"con
tainer","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2} Apr 24 21:27:28.561889 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.561850 2580 topology_manager.go:138] "Creating topology manager with none policy" Apr 24 21:27:28.561889 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.561859 2580 container_manager_linux.go:306] "Creating device plugin manager" Apr 24 21:27:28.561889 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.561871 2580 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:28.563233 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.563223 2580 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock" Apr 24 21:27:28.564549 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.564540 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:28.564655 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.564646 2580 server.go:1267] "Using root directory" path="/var/lib/kubelet" Apr 24 21:27:28.567477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.567468 2580 kubelet.go:491] "Attempting to sync node with API server" Apr 24 21:27:28.567514 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.567483 2580 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests" Apr 24 21:27:28.567514 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.567502 2580 file.go:69] "Watching path" path="/etc/kubernetes/manifests" Apr 24 21:27:28.567569 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.567525 2580 kubelet.go:397] "Adding apiserver pod source" Apr 24 21:27:28.567569 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.567544 2580 apiserver.go:42] "Waiting for node sync before watching 
apiserver pods" Apr 24 21:27:28.568598 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.568587 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:28.568645 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.568605 2580 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 24 21:27:28.573748 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.573731 2580 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 24 21:27:28.576360 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.576340 2580 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 24 21:27:28.577816 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577805 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 24 21:27:28.577867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577821 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 24 21:27:28.577867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577828 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 24 21:27:28.577867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577834 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 24 21:27:28.577867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577839 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 24 21:27:28.577867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577846 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 24 21:27:28.577867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577854 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 24 
21:27:28.577867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577863 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 24 21:27:28.578079 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577872 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 24 21:27:28.578079 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577878 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 24 21:27:28.578079 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577887 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 24 21:27:28.578079 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577895 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 24 21:27:28.578079 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577922 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 24 21:27:28.578079 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.577930 2580 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 24 21:27:28.581301 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.581288 2580 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 24 21:27:28.581351 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.581324 2580 server.go:1295] "Started kubelet" Apr 24 21:27:28.581432 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.581408 2580 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 24 21:27:28.581523 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.581482 2580 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 24 21:27:28.581571 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.581550 2580 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 24 21:27:28.582244 ip-10-0-133-36 systemd[1]: Started Kubernetes Kubelet. 
Apr 24 21:27:28.582807 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.582769 2580 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 24 21:27:28.583156 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.583131 2580 server.go:317] "Adding debug handlers to kubelet server" Apr 24 21:27:28.587742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.587713 2580 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-36.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 24 21:27:28.587828 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.587797 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 24 21:27:28.587882 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.587809 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 24 21:27:28.588632 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.587872 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-36.ec2.internal.18a96827e81870b3 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] 
[]},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-36.ec2.internal,UID:ip-10-0-133-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-36.ec2.internal,},FirstTimestamp:2026-04-24 21:27:28.581300403 +0000 UTC m=+0.379363363,LastTimestamp:2026-04-24 21:27:28.581300403 +0000 UTC m=+0.379363363,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-36.ec2.internal,}" Apr 24 21:27:28.588853 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.588827 2580 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:28.589264 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.589248 2580 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 24 21:27:28.590077 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590059 2580 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 24 21:27:28.590211 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590061 2580 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 24 21:27:28.590211 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590212 2580 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 24 21:27:28.590370 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.590074 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:28.590370 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590264 2580 reconstruct.go:97] "Volume reconstruction finished" Apr 24 21:27:28.590370 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590273 2580 reconciler.go:26] "Reconciler: start to sync state" Apr 24 21:27:28.590540 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590525 2580 factory.go:221] Registration of the 
containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 24 21:27:28.590589 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590543 2580 factory.go:55] Registering systemd factory Apr 24 21:27:28.590589 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590554 2580 factory.go:223] Registration of the systemd container factory successfully Apr 24 21:27:28.590785 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590771 2580 factory.go:153] Registering CRI-O factory Apr 24 21:27:28.590842 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590788 2580 factory.go:223] Registration of the crio container factory successfully Apr 24 21:27:28.590842 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590809 2580 factory.go:103] Registering Raw factory Apr 24 21:27:28.590842 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.590822 2580 manager.go:1196] Started watching for new ooms in manager Apr 24 21:27:28.591252 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.591239 2580 manager.go:319] Starting recovery of all containers Apr 24 21:27:28.592705 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.592441 2580 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 24 21:27:28.594402 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.594372 2580 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 24 21:27:28.594525 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.594499 2580 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-36.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 24 21:27:28.602449 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.602433 2580 manager.go:324] Recovery completed Apr 24 21:27:28.606877 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.606866 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.607964 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.607946 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sq75z" Apr 24 21:27:28.609160 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.609144 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.609219 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.609169 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.609219 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.609179 2580 kubelet_node_status.go:736] "Recording 
event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.609630 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.609616 2580 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 24 21:27:28.609630 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.609628 2580 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 24 21:27:28.609729 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.609645 2580 state_mem.go:36] "Initialized new in-memory state store" Apr 24 21:27:28.611971 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.611958 2580 policy_none.go:49] "None policy: Start" Apr 24 21:27:28.612030 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.611974 2580 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 24 21:27:28.612030 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.611984 2580 state_mem.go:35] "Initializing new in-memory state store" Apr 24 21:27:28.613487 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.613409 2580 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-36.ec2.internal.18a96827e9c17f1d default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-36.ec2.internal,UID:ip-10-0-133-36.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:NodeHasSufficientMemory,Message:Node ip-10-0-133-36.ec2.internal status is now: NodeHasSufficientMemory,Source:EventSource{Component:kubelet,Host:ip-10-0-133-36.ec2.internal,},FirstTimestamp:2026-04-24 21:27:28.609156893 +0000 UTC m=+0.407219854,LastTimestamp:2026-04-24 21:27:28.609156893 +0000 UTC m=+0.407219854,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-36.ec2.internal,}" Apr 24 
21:27:28.615844 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.615823 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-sq75z" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.653511 2580 manager.go:341] "Starting Device Plugin manager" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.653553 2580 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.653563 2580 server.go:85] "Starting device plugin registration server" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.653773 2580 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.653785 2580 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.653872 2580 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.653940 2580 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.653949 2580 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.654360 2580 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." 
err="non-existent label \"crio-containers\"" Apr 24 21:27:28.663738 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.654386 2580 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:28.682365 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.682344 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4" Apr 24 21:27:28.683513 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.683490 2580 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6" Apr 24 21:27:28.683513 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.683515 2580 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 24 21:27:28.683648 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.683533 2580 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 24 21:27:28.683648 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.683541 2580 kubelet.go:2451] "Starting kubelet main sync loop" Apr 24 21:27:28.683648 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.683578 2580 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 24 21:27:28.685812 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.685793 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:28.753913 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.753873 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.754638 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.754623 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.754709 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.754651 2580 
kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.754709 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.754662 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.754709 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.754685 2580 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.766929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.766912 2580 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.766977 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.766932 2580 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-36.ec2.internal\": node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:28.784102 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.784078 2580 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal"] Apr 24 21:27:28.784154 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.784141 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.784503 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.784491 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:28.784851 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.784836 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.784929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.784866 2580 kubelet_node_status.go:736] "Recording event message for 
node" node="ip-10-0-133-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.784929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.784878 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.787037 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787022 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.787148 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787120 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.787148 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787149 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.787630 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787613 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.787630 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787630 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.787725 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787644 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.787725 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787646 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.787725 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787678 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.787725 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.787662 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.789876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.789860 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.789961 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.789885 2580 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 24 21:27:28.790424 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.790410 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientMemory" Apr 24 21:27:28.790489 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.790437 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasNoDiskPressure" Apr 24 21:27:28.790489 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.790455 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeHasSufficientPID" Apr 24 21:27:28.791784 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.791771 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/baed0c904fe265b9d7df8c0cd38921b3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal\" (UID: \"baed0c904fe265b9d7df8c0cd38921b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.791846 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.791793 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: 
\"kubernetes.io/host-path/baed0c904fe265b9d7df8c0cd38921b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal\" (UID: \"baed0c904fe265b9d7df8c0cd38921b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.791846 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.791813 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4fcab00268e8e626e83b344c210c25fb-config\") pod \"kube-apiserver-proxy-ip-10-0-133-36.ec2.internal\" (UID: \"4fcab00268e8e626e83b344c210c25fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.812693 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.812674 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-36.ec2.internal\" not found" node="ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.816833 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.816817 2580 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-36.ec2.internal\" not found" node="ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.884969 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.884947 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:28.892314 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.892295 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4fcab00268e8e626e83b344c210c25fb-config\") pod \"kube-apiserver-proxy-ip-10-0-133-36.ec2.internal\" (UID: \"4fcab00268e8e626e83b344c210c25fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.892397 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.892325 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/baed0c904fe265b9d7df8c0cd38921b3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal\" (UID: \"baed0c904fe265b9d7df8c0cd38921b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.892397 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.892352 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/baed0c904fe265b9d7df8c0cd38921b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal\" (UID: \"baed0c904fe265b9d7df8c0cd38921b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.892489 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.892398 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/baed0c904fe265b9d7df8c0cd38921b3-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal\" (UID: \"baed0c904fe265b9d7df8c0cd38921b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.892489 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.892401 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/baed0c904fe265b9d7df8c0cd38921b3-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal\" (UID: \"baed0c904fe265b9d7df8c0cd38921b3\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.892489 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:28.892399 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/4fcab00268e8e626e83b344c210c25fb-config\") pod 
\"kube-apiserver-proxy-ip-10-0-133-36.ec2.internal\" (UID: \"4fcab00268e8e626e83b344c210c25fb\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" Apr 24 21:27:28.985459 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:28.985426 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.086241 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:29.086170 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.114610 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.114589 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:29.119289 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.119271 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" Apr 24 21:27:29.186561 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:29.186516 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.287079 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:29.287037 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.387666 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:29.387580 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.488476 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:29.488446 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.498899 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.498876 2580 
transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials" Apr 24 21:27:29.499039 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.499016 2580 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received" Apr 24 21:27:29.588550 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:29.588519 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.589070 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.589056 2580 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving" Apr 24 21:27:29.608755 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.607323 2580 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162" Apr 24 21:27:29.618190 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.618144 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-23 21:22:28 +0000 UTC" deadline="2028-01-09 06:12:52.950501674 +0000 UTC" Apr 24 21:27:29.618190 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.618184 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="14984h45m23.332320535s" Apr 24 21:27:29.637847 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.637806 2580 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-5mfpw" Apr 24 21:27:29.640770 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:29.640737 2580 manager.go:1169] Failed 
to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbaed0c904fe265b9d7df8c0cd38921b3.slice/crio-a21397d02eeb78e503de902aaec3fcf780dbcd76e0af963f0fd72434dd3e901b WatchSource:0}: Error finding container a21397d02eeb78e503de902aaec3fcf780dbcd76e0af963f0fd72434dd3e901b: Status 404 returned error can't find the container with id a21397d02eeb78e503de902aaec3fcf780dbcd76e0af963f0fd72434dd3e901b Apr 24 21:27:29.641964 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:29.641943 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4fcab00268e8e626e83b344c210c25fb.slice/crio-f960b5e79f55e54a0c1bb69d2a21f54448f268f068a5d9698ee39bfc949b04f6 WatchSource:0}: Error finding container f960b5e79f55e54a0c1bb69d2a21f54448f268f068a5d9698ee39bfc949b04f6: Status 404 returned error can't find the container with id f960b5e79f55e54a0c1bb69d2a21f54448f268f068a5d9698ee39bfc949b04f6 Apr 24 21:27:29.645141 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.645128 2580 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-5mfpw" Apr 24 21:27:29.645300 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.645282 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:27:29.686790 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.686739 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" event={"ID":"4fcab00268e8e626e83b344c210c25fb","Type":"ContainerStarted","Data":"f960b5e79f55e54a0c1bb69d2a21f54448f268f068a5d9698ee39bfc949b04f6"} Apr 24 21:27:29.687777 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.687755 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" 
event={"ID":"baed0c904fe265b9d7df8c0cd38921b3","Type":"ContainerStarted","Data":"a21397d02eeb78e503de902aaec3fcf780dbcd76e0af963f0fd72434dd3e901b"} Apr 24 21:27:29.688865 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:29.688849 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.789366 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:29.789331 2580 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-36.ec2.internal\" not found" Apr 24 21:27:29.814162 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.814135 2580 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:29.845176 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.845154 2580 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:29.890268 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.890189 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" Apr 24 21:27:29.908682 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.908662 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:29.910133 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.910119 2580 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" Apr 24 21:27:29.917808 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:29.917792 2580 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 24 21:27:30.504941 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.504709 
2580 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:30.569093 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.569061 2580 apiserver.go:52] "Watching apiserver" Apr 24 21:27:30.574976 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.574953 2580 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 24 21:27:30.576615 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.576590 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-rvnzj","kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx","openshift-image-registry/node-ca-pzhk2","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal","openshift-network-diagnostics/network-check-target-7k8nd","openshift-ovn-kubernetes/ovnkube-node-bt4zz","openshift-cluster-node-tuning-operator/tuned-nb65j","openshift-multus/multus-additional-cni-plugins-6hjtw","openshift-multus/multus-bvrnt","openshift-multus/network-metrics-daemon-c8k6b","openshift-network-operator/iptables-alerter-zx9c4"] Apr 24 21:27:30.581712 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.581686 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.582048 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.581892 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:27:30.582048 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:30.581978 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0" Apr 24 21:27:30.583983 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.583962 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 24 21:27:30.584103 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.584056 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 24 21:27:30.584103 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.584064 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-zgmsz\"" Apr 24 21:27:30.584333 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.584316 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 24 21:27:30.584857 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.584840 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.587860 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.587838 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 24 21:27:30.588126 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.588110 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 24 21:27:30.588366 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.588346 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 24 21:27:30.590246 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.588859 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-ncv4l\"" Apr 24 21:27:30.590246 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.588890 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 24 21:27:30.590246 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.589117 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 24 21:27:30.590246 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.589125 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 24 21:27:30.591691 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.591668 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.591829 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.591807 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:30.591916 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:30.591895 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc" Apr 24 21:27:30.593917 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.593899 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 24 21:27:30.594022 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.593937 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:27:30.594022 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.593956 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-dcxfz\"" Apr 24 21:27:30.594183 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.594160 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:30.596203 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.596176 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 24 21:27:30.596292 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.596233 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-wv7f2\"" Apr 24 21:27:30.596481 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.596464 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 24 21:27:30.598830 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.598766 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.598914 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.598887 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.600856 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.600837 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 24 21:27:30.600957 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.600939 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-sp2qp\"" Apr 24 21:27:30.601179 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601159 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 24 21:27:30.601256 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601165 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 24 21:27:30.601256 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601220 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 24 21:27:30.601256 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601234 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-5gw7j\"" Apr 24 21:27:30.601752 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601456 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 24 21:27:30.601752 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601535 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 24 21:27:30.601752 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601549 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.601752 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601655 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 24 21:27:30.601752 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.601721 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 24 21:27:30.603497 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603479 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-socket-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.603587 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603510 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-registration-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.603587 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-var-lib-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.603587 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603564 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-etc-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.603738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603611 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysconfig\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.603738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603659 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-lib-modules\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.603738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603697 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-cni-netd\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.603738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603726 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-etc-selinux\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: 
I0424 21:27:30.603751 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-sys-fs\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603775 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bvfj7\" (UniqueName: \"kubernetes.io/projected/cd4045ec-5e4a-42d6-b250-b2496d61e50b-kube-api-access-bvfj7\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603801 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zx9c4" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603801 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-ovn\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603825 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-run\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603838 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69ccf04c-a12c-4679-aca5-882d98643f14-etc-tuned\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603863 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-device-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.603931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603919 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-cni-bin\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603946 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovnkube-config\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.603969 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysctl-d\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604006 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysctl-conf\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604030 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jmlmj\" (UniqueName: \"kubernetes.io/projected/69ccf04c-a12c-4679-aca5-882d98643f14-kube-api-access-jmlmj\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604054 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-run-netns\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604075 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: 
\"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-log-socket\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604097 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-systemd\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604121 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-sys\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7cm8r\" (UniqueName: \"kubernetes.io/projected/dff89703-eb5c-40dd-b22c-a598308414bc-kube-api-access-7cm8r\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604197 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:27:30.604242 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-node-log\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604305 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604339 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-modprobe-d\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.604363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604364 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-kubelet\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604395 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-systemd\") pod \"ovnkube-node-bt4zz\" (UID: 
\"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604406 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604424 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604473 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovn-node-metrics-cert\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604497 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kj6mp\" (UniqueName: \"kubernetes.io/projected/33177b79-148a-414b-a4ea-c1c5c4ff4faf-kube-api-access-kj6mp\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604530 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-kubernetes\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604557 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-host\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604592 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69ccf04c-a12c-4679-aca5-882d98643f14-tmp\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604608 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-jb9zv\""
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604614 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-slash\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604638 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-env-overrides\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604693 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604717 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604738 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-systemd-units\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604760 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovnkube-script-lib\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.605008 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.604774 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-var-lib-kubelet\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.606177 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.606160 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 24 21:27:30.606517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.606354 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 24 21:27:30.606517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.606354 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-s8f2n\""
Apr 24 21:27:30.606517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.606479 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:27:30.646798 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.646763 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:29 +0000 UTC" deadline="2027-11-18 12:52:35.405516386 +0000 UTC"
Apr 24 21:27:30.646798 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.646798 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13743h25m4.758723142s"
Apr 24 21:27:30.691463 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.691434 2580 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 24 21:27:30.705418 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705392 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-node-log\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705418 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705420 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705437 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-host\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705457 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dpr8r\" (UniqueName: \"kubernetes.io/projected/a365012f-d4d3-4589-b859-78a30ad0e411-kube-api-access-dpr8r\") pod \"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705477 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-slash\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705508 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-node-log\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705525 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-cnibin\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705552 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-host\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705508 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705575 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.705604 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705598 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-slash\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705615 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-systemd-units\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705628 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-var-lib-kubelet\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705663 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-var-lib-kubelet\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705662 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-conf-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705680 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-systemd-units\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705697 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-socket-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705714 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysconfig\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705729 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-lib-modules\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705756 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b026702e-c0cf-4243-981d-4f64cfc8b0a0-host\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705775 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-cni-netd\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705799 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-etc-selinux\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705799 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-socket-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705817 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-run\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705834 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69ccf04c-a12c-4679-aca5-882d98643f14-etc-tuned\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705832 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-cni-netd\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.705876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705855 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-etc-selinux\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705860 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-lib-modules\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705796 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysconfig\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705877 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-run\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705899 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9f68a401-0090-4e11-a8d5-8ba136fddbec-agent-certs\") pod \"konnectivity-agent-rvnzj\" (UID: \"9f68a401-0090-4e11-a8d5-8ba136fddbec\") " pod="kube-system/konnectivity-agent-rvnzj"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705919 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-device-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705940 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-cni-bin\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705964 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovnkube-config\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.705977 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-device-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706007 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-cni-bin\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706021 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysctl-d\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706042 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-run-netns\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706057 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-systemd\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706088 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-sys\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706110 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-run-netns\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706115 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-socket-dir-parent\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706136 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysctl-d\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706140 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-k8s-cni-cncf-io\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.706477 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706144 2580 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706160 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-sys\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706163 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-systemd\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706173 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706222 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706263 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706343 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-modprobe-d\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706385 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-cni-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706414 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95d3f7fe-3212-4144-bef0-8f34cd69da83-cni-binary-copy\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706454 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-daemon-config\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706498 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706509 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-modprobe-d\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706543 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovnkube-config\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706534 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-netns\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706606 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-kubelet\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706638 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-systemd\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706680 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-kubelet\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.707262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovn-node-metrics-cert\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706725 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kj6mp\" (UniqueName: \"kubernetes.io/projected/33177b79-148a-414b-a4ea-c1c5c4ff4faf-kube-api-access-kj6mp\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706751 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-kubernetes\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706768 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-systemd\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706788 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69ccf04c-a12c-4679-aca5-882d98643f14-tmp\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706815 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-kubelet\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706823 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-kubernetes\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706841 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-env-overrides\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706878 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706903 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-os-release\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.706929 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-cni-bin\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707016 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a365012f-d4d3-4589-b859-78a30ad0e411-iptables-alerter-script\") pod \"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:30.707042 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:30.707123 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs podName:dff89703-eb5c-40dd-b22c-a598308414bc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.207103768 +0000 UTC m=+3.005166720 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs") pod "network-metrics-daemon-c8k6b" (UID: "dff89703-eb5c-40dd-b22c-a598308414bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707045 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a365012f-d4d3-4589-b859-78a30ad0e411-host-slash\") pod \"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707174 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b026702e-c0cf-4243-981d-4f64cfc8b0a0-serviceca\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707200 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-os-release\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw"
Apr 24 21:27:30.708132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707228 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovnkube-script-lib\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707253 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-cni-multus\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt"
Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707277 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw"
Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707303 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ctlv4\" (UniqueName: \"kubernetes.io/projected/5a0ab547-ecd2-4df3-9477-0144645571ea-kube-api-access-ctlv4\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw"
Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-env-overrides\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707343 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-registration-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707370 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-var-lib-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707397 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-etc-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707422 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-hostroot\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707440 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-registration-dir\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707445 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-multus-certs\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707471 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-etc-kubernetes\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707486 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-etc-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707497 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707521 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: 
\"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-var-lib-openvswitch\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707526 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-sys-fs\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.708893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707558 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bvfj7\" (UniqueName: \"kubernetes.io/projected/cd4045ec-5e4a-42d6-b250-b2496d61e50b-kube-api-access-bvfj7\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707583 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-ovn\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707608 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-system-cni-dir\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707628 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/cd4045ec-5e4a-42d6-b250-b2496d61e50b-sys-fs\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: \"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707638 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707674 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysctl-conf\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jmlmj\" (UniqueName: \"kubernetes.io/projected/69ccf04c-a12c-4679-aca5-882d98643f14-kube-api-access-jmlmj\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707726 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5rhk\" (UniqueName: \"kubernetes.io/projected/95d3f7fe-3212-4144-bef0-8f34cd69da83-kube-api-access-t5rhk\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.709517 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707753 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9f68a401-0090-4e11-a8d5-8ba136fddbec-konnectivity-ca\") pod \"konnectivity-agent-rvnzj\" (UID: \"9f68a401-0090-4e11-a8d5-8ba136fddbec\") " pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707777 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lhgbp\" (UniqueName: \"kubernetes.io/projected/b026702e-c0cf-4243-981d-4f64cfc8b0a0-kube-api-access-lhgbp\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707799 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-cnibin\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707802 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovnkube-script-lib\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707843 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6hjtw\" 
(UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707869 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-log-socket\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707911 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7cm8r\" (UniqueName: \"kubernetes.io/projected/dff89703-eb5c-40dd-b22c-a598308414bc-kube-api-access-7cm8r\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707927 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/69ccf04c-a12c-4679-aca5-882d98643f14-etc-sysctl-conf\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707936 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-system-cni-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.709517 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.707968 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: 
\"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-host-run-ovn-kubernetes\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.710017 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.708038 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-log-socket\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.710017 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.708045 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/33177b79-148a-414b-a4ea-c1c5c4ff4faf-run-ovn\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.710017 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.709634 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/69ccf04c-a12c-4679-aca5-882d98643f14-etc-tuned\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.710017 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.709674 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/69ccf04c-a12c-4679-aca5-882d98643f14-tmp\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.710017 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.709980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: 
\"kubernetes.io/secret/33177b79-148a-414b-a4ea-c1c5c4ff4faf-ovn-node-metrics-cert\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.717722 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:30.717703 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:30.717722 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:30.717723 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:30.717867 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:30.717733 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r962z for pod openshift-network-diagnostics/network-check-target-7k8nd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:30.717867 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:30.717798 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z podName:9a4e6661-523d-4d4f-bd08-473deabd33c0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:31.217785385 +0000 UTC m=+3.015848336 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "kube-api-access-r962z" (UniqueName: "kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z") pod "network-check-target-7k8nd" (UID: "9a4e6661-523d-4d4f-bd08-473deabd33c0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:30.720489 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.720466 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kj6mp\" (UniqueName: \"kubernetes.io/projected/33177b79-148a-414b-a4ea-c1c5c4ff4faf-kube-api-access-kj6mp\") pod \"ovnkube-node-bt4zz\" (UID: \"33177b79-148a-414b-a4ea-c1c5c4ff4faf\") " pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.721760 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.721732 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7cm8r\" (UniqueName: \"kubernetes.io/projected/dff89703-eb5c-40dd-b22c-a598308414bc-kube-api-access-7cm8r\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:30.722393 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.722154 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jmlmj\" (UniqueName: \"kubernetes.io/projected/69ccf04c-a12c-4679-aca5-882d98643f14-kube-api-access-jmlmj\") pod \"tuned-nb65j\" (UID: \"69ccf04c-a12c-4679-aca5-882d98643f14\") " pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.722393 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.722273 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-bvfj7\" (UniqueName: \"kubernetes.io/projected/cd4045ec-5e4a-42d6-b250-b2496d61e50b-kube-api-access-bvfj7\") pod \"aws-ebs-csi-driver-node-lscsx\" (UID: 
\"cd4045ec-5e4a-42d6-b250-b2496d61e50b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.808488 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808381 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-socket-dir-parent\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808488 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808422 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-k8s-cni-cncf-io\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808488 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808452 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-cni-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808488 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808476 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/95d3f7fe-3212-4144-bef0-8f34cd69da83-cni-binary-copy\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808501 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-daemon-config\") pod \"multus-bvrnt\" (UID: 
\"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808530 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808555 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-netns\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808595 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-kubelet\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808631 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-os-release\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808656 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-cni-bin\") pod \"multus-bvrnt\" (UID: 
\"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808687 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a365012f-d4d3-4589-b859-78a30ad0e411-iptables-alerter-script\") pod \"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808710 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a365012f-d4d3-4589-b859-78a30ad0e411-host-slash\") pod \"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808735 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b026702e-c0cf-4243-981d-4f64cfc8b0a0-serviceca\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.808772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-os-release\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808781 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: 
\"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-cni-multus\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808802 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808831 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ctlv4\" (UniqueName: \"kubernetes.io/projected/5a0ab547-ecd2-4df3-9477-0144645571ea-kube-api-access-ctlv4\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808898 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-kubelet\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808924 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-socket-dir-parent\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808937 2580 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-os-release\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.808980 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-k8s-cni-cncf-io\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809038 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-cni-bin\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809239 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809146 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-cni-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809356 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-var-lib-cni-multus\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809456 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: 
\"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-os-release\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809491 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a365012f-d4d3-4589-b859-78a30ad0e411-host-slash\") pod \"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809511 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-daemon-config\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809519 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-hostroot\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809538 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-multus-certs\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809537 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: 
\"kubernetes.io/configmap/95d3f7fe-3212-4144-bef0-8f34cd69da83-cni-binary-copy\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809553 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-etc-kubernetes\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809569 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-hostroot\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809586 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-etc-kubernetes\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809589 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-multus-certs\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809606 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-cni-binary-copy\") pod 
\"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809626 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-system-cni-dir\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.809649 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5rhk\" (UniqueName: \"kubernetes.io/projected/95d3f7fe-3212-4144-bef0-8f34cd69da83-kube-api-access-t5rhk\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809664 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9f68a401-0090-4e11-a8d5-8ba136fddbec-konnectivity-ca\") pod \"konnectivity-agent-rvnzj\" (UID: \"9f68a401-0090-4e11-a8d5-8ba136fddbec\") " pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809682 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809690 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lhgbp\" 
(UniqueName: \"kubernetes.io/projected/b026702e-c0cf-4243-981d-4f64cfc8b0a0-kube-api-access-lhgbp\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809695 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-system-cni-dir\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-cnibin\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809848 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-cnibin\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809871 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809905 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-system-cni-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809934 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dpr8r\" (UniqueName: \"kubernetes.io/projected/a365012f-d4d3-4589-b859-78a30ad0e411-kube-api-access-dpr8r\") pod \"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809958 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-cnibin\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809971 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-host-run-netns\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.809985 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-conf-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810030 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" 
(UniqueName: \"kubernetes.io/host-path/b026702e-c0cf-4243-981d-4f64cfc8b0a0-host\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810063 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-system-cni-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810069 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9f68a401-0090-4e11-a8d5-8ba136fddbec-agent-certs\") pod \"konnectivity-agent-rvnzj\" (UID: \"9f68a401-0090-4e11-a8d5-8ba136fddbec\") " pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810069 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b026702e-c0cf-4243-981d-4f64cfc8b0a0-host\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810112 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-cnibin\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.810288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810137 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/a365012f-d4d3-4589-b859-78a30ad0e411-iptables-alerter-script\") pod 
\"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4" Apr 24 21:27:30.810948 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810149 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/95d3f7fe-3212-4144-bef0-8f34cd69da83-multus-conf-dir\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.810948 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810235 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/5a0ab547-ecd2-4df3-9477-0144645571ea-tuning-conf-dir\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.810948 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810248 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.810948 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810276 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/5a0ab547-ecd2-4df3-9477-0144645571ea-cni-binary-copy\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.810948 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.810298 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/b026702e-c0cf-4243-981d-4f64cfc8b0a0-serviceca\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.811361 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.811336 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/9f68a401-0090-4e11-a8d5-8ba136fddbec-konnectivity-ca\") pod \"konnectivity-agent-rvnzj\" (UID: \"9f68a401-0090-4e11-a8d5-8ba136fddbec\") " pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:30.812577 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.812549 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/9f68a401-0090-4e11-a8d5-8ba136fddbec-agent-certs\") pod \"konnectivity-agent-rvnzj\" (UID: \"9f68a401-0090-4e11-a8d5-8ba136fddbec\") " pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:30.821023 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.820905 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ctlv4\" (UniqueName: \"kubernetes.io/projected/5a0ab547-ecd2-4df3-9477-0144645571ea-kube-api-access-ctlv4\") pod \"multus-additional-cni-plugins-6hjtw\" (UID: \"5a0ab547-ecd2-4df3-9477-0144645571ea\") " pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.821164 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.821145 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5rhk\" (UniqueName: \"kubernetes.io/projected/95d3f7fe-3212-4144-bef0-8f34cd69da83-kube-api-access-t5rhk\") pod \"multus-bvrnt\" (UID: \"95d3f7fe-3212-4144-bef0-8f34cd69da83\") " pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.822479 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.822458 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dpr8r\" (UniqueName: 
\"kubernetes.io/projected/a365012f-d4d3-4589-b859-78a30ad0e411-kube-api-access-dpr8r\") pod \"iptables-alerter-zx9c4\" (UID: \"a365012f-d4d3-4589-b859-78a30ad0e411\") " pod="openshift-network-operator/iptables-alerter-zx9c4" Apr 24 21:27:30.822572 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.822508 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lhgbp\" (UniqueName: \"kubernetes.io/projected/b026702e-c0cf-4243-981d-4f64cfc8b0a0-kube-api-access-lhgbp\") pod \"node-ca-pzhk2\" (UID: \"b026702e-c0cf-4243-981d-4f64cfc8b0a0\") " pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.841033 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.841012 2580 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 24 21:27:30.895082 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.895056 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" Apr 24 21:27:30.901886 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.901858 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:30.910523 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.910505 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-nb65j" Apr 24 21:27:30.915027 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.915009 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:30.920519 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.920499 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/node-ca-pzhk2" Apr 24 21:27:30.927078 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.927059 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" Apr 24 21:27:30.933598 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.933580 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bvrnt" Apr 24 21:27:30.939100 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:30.939080 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-zx9c4" Apr 24 21:27:31.211866 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.211800 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:31.212012 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:31.211899 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:31.212012 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:31.211951 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs podName:dff89703-eb5c-40dd-b22c-a598308414bc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.211937526 +0000 UTC m=+4.010000474 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs") pod "network-metrics-daemon-c8k6b" (UID: "dff89703-eb5c-40dd-b22c-a598308414bc") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:31.263834 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:31.263805 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a0ab547_ecd2_4df3_9477_0144645571ea.slice/crio-93ed876cbf1509b8b073eab7d5370cb8086c51c88cc9c27031f7f2d787f04b54 WatchSource:0}: Error finding container 93ed876cbf1509b8b073eab7d5370cb8086c51c88cc9c27031f7f2d787f04b54: Status 404 returned error can't find the container with id 93ed876cbf1509b8b073eab7d5370cb8086c51c88cc9c27031f7f2d787f04b54 Apr 24 21:27:31.264556 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:31.264530 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod95d3f7fe_3212_4144_bef0_8f34cd69da83.slice/crio-92336e4598918c9839f139036821d6a661b797980d9284a56a1e930322d4d087 WatchSource:0}: Error finding container 92336e4598918c9839f139036821d6a661b797980d9284a56a1e930322d4d087: Status 404 returned error can't find the container with id 92336e4598918c9839f139036821d6a661b797980d9284a56a1e930322d4d087 Apr 24 21:27:31.265392 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:31.265357 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod69ccf04c_a12c_4679_aca5_882d98643f14.slice/crio-d0392cae3f647da405851dafb3872c784a90051b5fda9b66c7268335a6087959 WatchSource:0}: Error finding container d0392cae3f647da405851dafb3872c784a90051b5fda9b66c7268335a6087959: Status 404 returned error can't find the container with id d0392cae3f647da405851dafb3872c784a90051b5fda9b66c7268335a6087959 Apr 24 21:27:31.269250 
ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:31.269230 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod33177b79_148a_414b_a4ea_c1c5c4ff4faf.slice/crio-2fe2c9d6a63d0a3dd147ebee9db5921ab5ace557dbc22e693dc82d8c4f869b31 WatchSource:0}: Error finding container 2fe2c9d6a63d0a3dd147ebee9db5921ab5ace557dbc22e693dc82d8c4f869b31: Status 404 returned error can't find the container with id 2fe2c9d6a63d0a3dd147ebee9db5921ab5ace557dbc22e693dc82d8c4f869b31 Apr 24 21:27:31.270086 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:31.270066 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda365012f_d4d3_4589_b859_78a30ad0e411.slice/crio-bf4406466256aa043a01eff43872e29598d5db6c57552ab53b7707e9c15a6744 WatchSource:0}: Error finding container bf4406466256aa043a01eff43872e29598d5db6c57552ab53b7707e9c15a6744: Status 404 returned error can't find the container with id bf4406466256aa043a01eff43872e29598d5db6c57552ab53b7707e9c15a6744 Apr 24 21:27:31.271817 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:31.271794 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podcd4045ec_5e4a_42d6_b250_b2496d61e50b.slice/crio-a99c031fa0b4aca1c378f3aac832773b07da1eb4e391f7d266b02db793efc568 WatchSource:0}: Error finding container a99c031fa0b4aca1c378f3aac832773b07da1eb4e391f7d266b02db793efc568: Status 404 returned error can't find the container with id a99c031fa0b4aca1c378f3aac832773b07da1eb4e391f7d266b02db793efc568 Apr 24 21:27:31.272637 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:31.272603 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb026702e_c0cf_4243_981d_4f64cfc8b0a0.slice/crio-94bc8f50178e18c049ee6d064a5bdbf0aee5ee9bb00a4ff4b964bb03bb93dcd3 WatchSource:0}: Error 
finding container 94bc8f50178e18c049ee6d064a5bdbf0aee5ee9bb00a4ff4b964bb03bb93dcd3: Status 404 returned error can't find the container with id 94bc8f50178e18c049ee6d064a5bdbf0aee5ee9bb00a4ff4b964bb03bb93dcd3 Apr 24 21:27:31.273382 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:27:31.273259 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9f68a401_0090_4e11_a8d5_8ba136fddbec.slice/crio-1aa9e7b9eef4aa90c9aa47808409a481dbe1b1435247e79631af5ff1476a047d WatchSource:0}: Error finding container 1aa9e7b9eef4aa90c9aa47808409a481dbe1b1435247e79631af5ff1476a047d: Status 404 returned error can't find the container with id 1aa9e7b9eef4aa90c9aa47808409a481dbe1b1435247e79631af5ff1476a047d Apr 24 21:27:31.312069 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.312048 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:27:31.312563 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:31.312201 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 24 21:27:31.312563 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:31.312221 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 24 21:27:31.312563 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:31.312234 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r962z for pod openshift-network-diagnostics/network-check-target-7k8nd: [object 
"openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:31.312563 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:31.312291 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z podName:9a4e6661-523d-4d4f-bd08-473deabd33c0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:32.3122736 +0000 UTC m=+4.110336552 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-r962z" (UniqueName: "kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z") pod "network-check-target-7k8nd" (UID: "9a4e6661-523d-4d4f-bd08-473deabd33c0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 24 21:27:31.647626 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.647510 2580 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-23 21:22:29 +0000 UTC" deadline="2027-10-28 22:46:37.737883978 +0000 UTC" Apr 24 21:27:31.647626 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.647549 2580 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13249h19m6.090338619s" Apr 24 21:27:31.694777 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.694717 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rvnzj" event={"ID":"9f68a401-0090-4e11-a8d5-8ba136fddbec","Type":"ContainerStarted","Data":"1aa9e7b9eef4aa90c9aa47808409a481dbe1b1435247e79631af5ff1476a047d"} Apr 24 21:27:31.699813 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.699759 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" 
event={"ID":"cd4045ec-5e4a-42d6-b250-b2496d61e50b","Type":"ContainerStarted","Data":"a99c031fa0b4aca1c378f3aac832773b07da1eb4e391f7d266b02db793efc568"} Apr 24 21:27:31.705430 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.705364 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"2fe2c9d6a63d0a3dd147ebee9db5921ab5ace557dbc22e693dc82d8c4f869b31"} Apr 24 21:27:31.707764 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.707704 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" event={"ID":"5a0ab547-ecd2-4df3-9477-0144645571ea","Type":"ContainerStarted","Data":"93ed876cbf1509b8b073eab7d5370cb8086c51c88cc9c27031f7f2d787f04b54"} Apr 24 21:27:31.715422 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.715397 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pzhk2" event={"ID":"b026702e-c0cf-4243-981d-4f64cfc8b0a0","Type":"ContainerStarted","Data":"94bc8f50178e18c049ee6d064a5bdbf0aee5ee9bb00a4ff4b964bb03bb93dcd3"} Apr 24 21:27:31.725435 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.725404 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zx9c4" event={"ID":"a365012f-d4d3-4589-b859-78a30ad0e411","Type":"ContainerStarted","Data":"bf4406466256aa043a01eff43872e29598d5db6c57552ab53b7707e9c15a6744"} Apr 24 21:27:31.730669 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.730418 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nb65j" event={"ID":"69ccf04c-a12c-4679-aca5-882d98643f14","Type":"ContainerStarted","Data":"d0392cae3f647da405851dafb3872c784a90051b5fda9b66c7268335a6087959"} Apr 24 21:27:31.735744 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.733097 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/multus-bvrnt" event={"ID":"95d3f7fe-3212-4144-bef0-8f34cd69da83","Type":"ContainerStarted","Data":"92336e4598918c9839f139036821d6a661b797980d9284a56a1e930322d4d087"} Apr 24 21:27:31.742592 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.742565 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" event={"ID":"4fcab00268e8e626e83b344c210c25fb","Type":"ContainerStarted","Data":"dc9d974da9534a8298abf3f099601e75a008e6425d1906490c954354ca15c4e4"} Apr 24 21:27:31.768011 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:31.764523 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-36.ec2.internal" podStartSLOduration=2.764505265 podStartE2EDuration="2.764505265s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:31.764256623 +0000 UTC m=+3.562319594" watchObservedRunningTime="2026-04-24 21:27:31.764505265 +0000 UTC m=+3.562568238" Apr 24 21:27:32.217074 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:32.216965 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:32.217229 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:32.217131 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 24 21:27:32.217229 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:32.217195 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs 
podName:dff89703-eb5c-40dd-b22c-a598308414bc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:34.217176177 +0000 UTC m=+6.015239130 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs") pod "network-metrics-daemon-c8k6b" (UID: "dff89703-eb5c-40dd-b22c-a598308414bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:32.318191 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:32.318108 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:32.318356 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:32.318274 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:32.318356 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:32.318295 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:32.318356 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:32.318306 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r962z for pod openshift-network-diagnostics/network-check-target-7k8nd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:32.318515 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:32.318362 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z podName:9a4e6661-523d-4d4f-bd08-473deabd33c0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:34.318345096 +0000 UTC m=+6.116408050 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-r962z" (UniqueName: "kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z") pod "network-check-target-7k8nd" (UID: "9a4e6661-523d-4d4f-bd08-473deabd33c0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:32.685215 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:32.685137 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:32.685215 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:32.685177 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:32.685702 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:32.685268 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:32.685702 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:32.685387 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:32.748621 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:32.748582 2580 generic.go:358] "Generic (PLEG): container finished" podID="baed0c904fe265b9d7df8c0cd38921b3" containerID="b91f4108cd574c446d9eb5dc202efd4dad53d3d69099dd3303f4072433b53860" exitCode=0
Apr 24 21:27:32.748799 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:32.748698 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" event={"ID":"baed0c904fe265b9d7df8c0cd38921b3","Type":"ContainerDied","Data":"b91f4108cd574c446d9eb5dc202efd4dad53d3d69099dd3303f4072433b53860"}
Apr 24 21:27:33.756509 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:33.756430 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" event={"ID":"baed0c904fe265b9d7df8c0cd38921b3","Type":"ContainerStarted","Data":"8009848c8b8a6aa81119413f3aabefb16adebbaa6f4bd37e0386adb2b60238af"}
Apr 24 21:27:33.770585 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:33.770536 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-36.ec2.internal" podStartSLOduration=4.770519738 podStartE2EDuration="4.770519738s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:27:33.770051779 +0000 UTC m=+5.568114752" watchObservedRunningTime="2026-04-24 21:27:33.770519738 +0000 UTC m=+5.568582707"
Apr 24 21:27:34.235397 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:34.235302 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:34.235548 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:34.235460 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:34.235548 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:34.235520 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs podName:dff89703-eb5c-40dd-b22c-a598308414bc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.235502162 +0000 UTC m=+10.033565110 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs") pod "network-metrics-daemon-c8k6b" (UID: "dff89703-eb5c-40dd-b22c-a598308414bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:34.335882 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:34.335845 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:34.336076 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:34.336052 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:34.336076 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:34.336072 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:34.336190 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:34.336083 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r962z for pod openshift-network-diagnostics/network-check-target-7k8nd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:34.336190 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:34.336141 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z podName:9a4e6661-523d-4d4f-bd08-473deabd33c0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:38.336122722 +0000 UTC m=+10.134185677 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-r962z" (UniqueName: "kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z") pod "network-check-target-7k8nd" (UID: "9a4e6661-523d-4d4f-bd08-473deabd33c0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:34.684063 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:34.683861 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:34.684063 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:34.684044 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:34.684280 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:34.684135 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:34.684280 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:34.684244 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:36.684747 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:36.684713 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:36.685243 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:36.684713 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:36.685243 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:36.684842 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:36.685243 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:36.684927 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:38.265729 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:38.265103 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:38.265729 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:38.265260 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:38.265729 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:38.265324 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs podName:dff89703-eb5c-40dd-b22c-a598308414bc nodeName:}" failed. No retries permitted until 2026-04-24 21:27:46.265305307 +0000 UTC m=+18.063368260 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs") pod "network-metrics-daemon-c8k6b" (UID: "dff89703-eb5c-40dd-b22c-a598308414bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:38.366362 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:38.366332 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:38.366536 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:38.366460 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:38.366536 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:38.366473 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:38.366536 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:38.366482 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r962z for pod openshift-network-diagnostics/network-check-target-7k8nd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:38.366536 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:38.366526 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z podName:9a4e6661-523d-4d4f-bd08-473deabd33c0 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:46.366512953 +0000 UTC m=+18.164575901 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-r962z" (UniqueName: "kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z") pod "network-check-target-7k8nd" (UID: "9a4e6661-523d-4d4f-bd08-473deabd33c0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:38.687756 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:38.687655 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:38.687898 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:38.687772 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:38.688154 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:38.687655 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:38.688247 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:38.688185 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:40.684125 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:40.684086 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:40.684549 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:40.684086 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:40.684549 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:40.684220 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:40.684549 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:40.684357 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:42.626799 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.626764 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-m4t52"]
Apr 24 21:27:42.629761 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.629740 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.629895 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:42.629815 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081"
Apr 24 21:27:42.684291 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.684263 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:42.684469 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:42.684376 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:42.684469 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.684425 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:42.684571 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:42.684530 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:42.696759 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.696726 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/74148c1b-8ecf-4750-b22d-3bb17904b081-dbus\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.696907 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.696772 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.696907 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.696804 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/74148c1b-8ecf-4750-b22d-3bb17904b081-kubelet-config\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.798145 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.798109 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/74148c1b-8ecf-4750-b22d-3bb17904b081-dbus\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.798145 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.798160 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.798380 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.798191 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/74148c1b-8ecf-4750-b22d-3bb17904b081-kubelet-config\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.798380 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.798278 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/74148c1b-8ecf-4750-b22d-3bb17904b081-kubelet-config\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.798380 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:42.798301 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:42.798380 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:42.798333 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/74148c1b-8ecf-4750-b22d-3bb17904b081-dbus\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:42.798380 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:42.798382 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret podName:74148c1b-8ecf-4750-b22d-3bb17904b081 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:43.298359139 +0000 UTC m=+15.096422097 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret") pod "global-pull-secret-syncer-m4t52" (UID: "74148c1b-8ecf-4750-b22d-3bb17904b081") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:43.302264 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:43.302230 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:43.302408 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:43.302372 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:43.302468 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:43.302443 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret podName:74148c1b-8ecf-4750-b22d-3bb17904b081 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:44.302423461 +0000 UTC m=+16.100486412 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret") pod "global-pull-secret-syncer-m4t52" (UID: "74148c1b-8ecf-4750-b22d-3bb17904b081") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:44.309374 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:44.309340 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:44.309756 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:44.309495 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:44.309756 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:44.309570 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret podName:74148c1b-8ecf-4750-b22d-3bb17904b081 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:46.309554994 +0000 UTC m=+18.107617947 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret") pod "global-pull-secret-syncer-m4t52" (UID: "74148c1b-8ecf-4750-b22d-3bb17904b081") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:44.684007 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:44.683955 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:44.684007 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:44.684003 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:44.684234 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:44.684037 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:44.684234 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:44.684143 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:44.684234 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:44.684216 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081"
Apr 24 21:27:44.684364 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:44.684275 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:46.323620 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:46.323574 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:46.324195 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:46.323632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:46.324195 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.323742 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:46.324195 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.323744 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:46.324195 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.323818 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret podName:74148c1b-8ecf-4750-b22d-3bb17904b081 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:50.323798311 +0000 UTC m=+22.121861277 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret") pod "global-pull-secret-syncer-m4t52" (UID: "74148c1b-8ecf-4750-b22d-3bb17904b081") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:46.324195 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.323839 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs podName:dff89703-eb5c-40dd-b22c-a598308414bc nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.32382886 +0000 UTC m=+34.121891809 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs") pod "network-metrics-daemon-c8k6b" (UID: "dff89703-eb5c-40dd-b22c-a598308414bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:27:46.424159 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:46.424122 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:46.424313 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.424276 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 24 21:27:46.424313 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.424294 2580 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 24 21:27:46.424313 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.424303 2580 projected.go:194] Error preparing data for projected volume kube-api-access-r962z for pod openshift-network-diagnostics/network-check-target-7k8nd: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:46.424408 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.424356 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z podName:9a4e6661-523d-4d4f-bd08-473deabd33c0 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.424340095 +0000 UTC m=+34.222403064 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-r962z" (UniqueName: "kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z") pod "network-check-target-7k8nd" (UID: "9a4e6661-523d-4d4f-bd08-473deabd33c0") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 24 21:27:46.684115 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:46.684066 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:46.684115 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:46.684086 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:46.684115 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:46.684112 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:46.684401 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.684200 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:46.684401 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.684363 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081"
Apr 24 21:27:46.684560 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:46.684458 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:48.685145 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.684676 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:48.686040 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.684788 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:48.686040 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:48.685239 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:48.686040 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.684818 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:48.686040 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:48.685310 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081"
Apr 24 21:27:48.686040 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:48.685418 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:48.782913 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.782884 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-nb65j" event={"ID":"69ccf04c-a12c-4679-aca5-882d98643f14","Type":"ContainerStarted","Data":"9348c3ea4f6f17e49b95bbdfb0509ba69a0e13bd9b018acfdb2b59ddd3ff395d"}
Apr 24 21:27:48.784455 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.784424 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bvrnt" event={"ID":"95d3f7fe-3212-4144-bef0-8f34cd69da83","Type":"ContainerStarted","Data":"e62b6c6ed305b0ae4bbc67fb84c9777d47d7838b484d286fd97972667b913aaa"}
Apr 24 21:27:48.785700 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.785675 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-rvnzj" event={"ID":"9f68a401-0090-4e11-a8d5-8ba136fddbec","Type":"ContainerStarted","Data":"44ce1360f07ef85013969ad18548a68dcb0d1a2cab49a2d3479e349aba24d9e0"}
Apr 24 21:27:48.787046 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.787023 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" event={"ID":"cd4045ec-5e4a-42d6-b250-b2496d61e50b","Type":"ContainerStarted","Data":"9b9bdd57c234fdbe22c881fc08c5f0bf71f199193a8905fe1d6536781922bf84"}
Apr 24 21:27:48.789980 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.789959 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 21:27:48.790367 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.790342 2580 generic.go:358] "Generic (PLEG): container finished" podID="33177b79-148a-414b-a4ea-c1c5c4ff4faf" containerID="3829ef43274a8ef656bdd2c354be726f4ddec418860197089c0e295c97e26bb5" exitCode=1
Apr 24 21:27:48.790459
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.790407 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"ccbb2475af5049611991299fa85be5ede62bb7298016ead5deeca4e0b7944723"} Apr 24 21:27:48.790459 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.790436 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"dcd1eb15158bc1c5938c9d5a911b781f74a0df9fb51ad109c57a7a188020542f"} Apr 24 21:27:48.790459 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.790452 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"8861cef1918fcffdda0996b4f84e284632cfb3f8605f4c657651637df119d85a"} Apr 24 21:27:48.790606 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.790467 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"f5742a9702a8f0c9d3edaabaabfdbdf1945f53988c77d91358a7538af8506ef1"} Apr 24 21:27:48.790606 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.790480 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerDied","Data":"3829ef43274a8ef656bdd2c354be726f4ddec418860197089c0e295c97e26bb5"} Apr 24 21:27:48.790606 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.790498 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" 
event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"b93137f8fce117ddfab53a0f97b77f37a9bf9de66121a39c5425f31db73ba562"} Apr 24 21:27:48.791905 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.791881 2580 generic.go:358] "Generic (PLEG): container finished" podID="5a0ab547-ecd2-4df3-9477-0144645571ea" containerID="95ce5b1f0236ddbe5cf665e85b53d1f1e6a62739bda1edd7fa84dac59ce333e2" exitCode=0 Apr 24 21:27:48.792016 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.791949 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" event={"ID":"5a0ab547-ecd2-4df3-9477-0144645571ea","Type":"ContainerDied","Data":"95ce5b1f0236ddbe5cf665e85b53d1f1e6a62739bda1edd7fa84dac59ce333e2"} Apr 24 21:27:48.794699 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.794664 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-pzhk2" event={"ID":"b026702e-c0cf-4243-981d-4f64cfc8b0a0","Type":"ContainerStarted","Data":"8191f281059247a68d0918fcf895f02a1b1ff1e7cd6175d9358e63d7f0d6db01"} Apr 24 21:27:48.798936 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.798882 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-nb65j" podStartSLOduration=4.05280368 podStartE2EDuration="20.798868961s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.26771963 +0000 UTC m=+3.065782592" lastFinishedPulling="2026-04-24 21:27:48.013784923 +0000 UTC m=+19.811847873" observedRunningTime="2026-04-24 21:27:48.798760773 +0000 UTC m=+20.596823767" watchObservedRunningTime="2026-04-24 21:27:48.798868961 +0000 UTC m=+20.596931932" Apr 24 21:27:48.812856 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.812816 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-rvnzj" podStartSLOduration=3.431795136 
podStartE2EDuration="19.812804721s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.275886588 +0000 UTC m=+3.073949536" lastFinishedPulling="2026-04-24 21:27:47.656896168 +0000 UTC m=+19.454959121" observedRunningTime="2026-04-24 21:27:48.812238566 +0000 UTC m=+20.610301537" watchObservedRunningTime="2026-04-24 21:27:48.812804721 +0000 UTC m=+20.610867691" Apr 24 21:27:48.854587 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.854537 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bvrnt" podStartSLOduration=3.071621502 podStartE2EDuration="19.854522396s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.266205079 +0000 UTC m=+3.064268040" lastFinishedPulling="2026-04-24 21:27:48.049105982 +0000 UTC m=+19.847168934" observedRunningTime="2026-04-24 21:27:48.854221625 +0000 UTC m=+20.652284586" watchObservedRunningTime="2026-04-24 21:27:48.854522396 +0000 UTC m=+20.652585368" Apr 24 21:27:48.870656 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:48.870621 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-pzhk2" podStartSLOduration=3.131589908 podStartE2EDuration="19.870608639s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.274765243 +0000 UTC m=+3.072828191" lastFinishedPulling="2026-04-24 21:27:48.013783961 +0000 UTC m=+19.811846922" observedRunningTime="2026-04-24 21:27:48.870353294 +0000 UTC m=+20.668416274" watchObservedRunningTime="2026-04-24 21:27:48.870608639 +0000 UTC m=+20.668671609" Apr 24 21:27:49.515531 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.515343 2580 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 24 21:27:49.666285 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.666173 2580 
reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-24T21:27:49.515526971Z","UUID":"e7195e75-aaee-47cd-9d13-7a6c9db99f96","Handler":null,"Name":"","Endpoint":""} Apr 24 21:27:49.668058 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.667985 2580 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 24 21:27:49.668058 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.668028 2580 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 24 21:27:49.798525 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.798487 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" event={"ID":"cd4045ec-5e4a-42d6-b250-b2496d61e50b","Type":"ContainerStarted","Data":"8626f78dce67359fe85433454e49690960ff6e46f78d037b8ca7ee6c1c33c6b6"} Apr 24 21:27:49.800064 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.800025 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-zx9c4" event={"ID":"a365012f-d4d3-4589-b859-78a30ad0e411","Type":"ContainerStarted","Data":"f803176fa5d8b22e2d082a4b30e825e6b7fe9c8e5a446bc1fc0018a0e6d28213"} Apr 24 21:27:49.814769 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.814721 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-zx9c4" podStartSLOduration=4.100580901 podStartE2EDuration="20.814709347s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.27258494 +0000 UTC m=+3.070647888" lastFinishedPulling="2026-04-24 21:27:47.986713383 +0000 UTC m=+19.784776334" observedRunningTime="2026-04-24 
21:27:49.814232238 +0000 UTC m=+21.612295210" watchObservedRunningTime="2026-04-24 21:27:49.814709347 +0000 UTC m=+21.612772352" Apr 24 21:27:49.988779 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.988757 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:49.989386 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:49.989368 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:50.358495 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:50.358415 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:27:50.358678 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:50.358553 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:50.358678 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:50.358613 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret podName:74148c1b-8ecf-4750-b22d-3bb17904b081 nodeName:}" failed. No retries permitted until 2026-04-24 21:27:58.358599965 +0000 UTC m=+30.156662913 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret") pod "global-pull-secret-syncer-m4t52" (UID: "74148c1b-8ecf-4750-b22d-3bb17904b081") : object "kube-system"/"original-pull-secret" not registered Apr 24 21:27:50.684689 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:50.684635 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:27:50.684875 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:50.684635 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:27:50.684875 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:50.684762 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0" Apr 24 21:27:50.684875 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:50.684836 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081" Apr 24 21:27:50.684875 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:50.684635 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:50.685098 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:50.684928 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc" Apr 24 21:27:50.803761 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:50.803716 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" event={"ID":"cd4045ec-5e4a-42d6-b250-b2496d61e50b","Type":"ContainerStarted","Data":"00e087c782eb6e73a8144ae49cd6acb8cb6d2f9bbdd3620e31cfa780be77eec7"} Apr 24 21:27:50.804415 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:50.804261 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:50.804645 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:50.804618 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-rvnzj" Apr 24 21:27:50.836458 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:50.836400 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-lscsx" podStartSLOduration=3.614814088 podStartE2EDuration="22.836383021s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.273765217 +0000 UTC m=+3.071828180" lastFinishedPulling="2026-04-24 21:27:50.49533415 +0000 UTC m=+22.293397113" observedRunningTime="2026-04-24 21:27:50.836276409 +0000 UTC m=+22.634339380" watchObservedRunningTime="2026-04-24 21:27:50.836383021 +0000 UTC m=+22.634445993" Apr 24 21:27:51.808501 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:51.808475 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:27:51.809173 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:51.808878 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"ad1891600c6ef43838f63609e437e392a6d68ba7213ef3443b35eaa4cbdec1a6"} Apr 24 21:27:52.684061 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:52.684026 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:27:52.684257 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:52.684026 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:27:52.684257 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:52.684150 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081" Apr 24 21:27:52.684257 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:52.684163 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:52.684417 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:52.684249 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0" Apr 24 21:27:52.684417 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:52.684345 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc" Apr 24 21:27:53.814579 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:53.814403 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:27:53.815085 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:53.814900 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"d779e2c021e7e9dc692c4add28cb3fd2e9c4e80a07ef82ad651fd8944b601f19"} Apr 24 21:27:53.815360 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:53.815291 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:53.815466 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:53.815450 2580 scope.go:117] "RemoveContainer" 
containerID="3829ef43274a8ef656bdd2c354be726f4ddec418860197089c0e295c97e26bb5" Apr 24 21:27:53.816586 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:53.816559 2580 generic.go:358] "Generic (PLEG): container finished" podID="5a0ab547-ecd2-4df3-9477-0144645571ea" containerID="713948d12c10fd07a69faee0494449e635baf71e63a77da831dbaca66e6c7398" exitCode=0 Apr 24 21:27:53.816691 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:53.816607 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" event={"ID":"5a0ab547-ecd2-4df3-9477-0144645571ea","Type":"ContainerDied","Data":"713948d12c10fd07a69faee0494449e635baf71e63a77da831dbaca66e6c7398"} Apr 24 21:27:53.832524 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:53.832506 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:54.684029 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.683935 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:27:54.684029 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.683935 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:27:54.684214 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:54.684046 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0" Apr 24 21:27:54.684214 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.683935 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:54.684214 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:54.684110 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081" Apr 24 21:27:54.684313 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:54.684221 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc" Apr 24 21:27:54.821272 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.821247 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:27:54.821613 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.821589 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" event={"ID":"33177b79-148a-414b-a4ea-c1c5c4ff4faf","Type":"ContainerStarted","Data":"5df1fbba5064488401193e2581c16dffbfbc347521ea5bde17c42c96291e5afa"} Apr 24 21:27:54.821735 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.821722 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:27:54.821916 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.821894 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 
21:27:54.823507 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.823482 2580 generic.go:358] "Generic (PLEG): container finished" podID="5a0ab547-ecd2-4df3-9477-0144645571ea" containerID="994a4cc398068e86e5999ecf6c97fa98d5ee81f93d833cc78758e1ae9b9692f4" exitCode=0 Apr 24 21:27:54.823614 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.823523 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" event={"ID":"5a0ab547-ecd2-4df3-9477-0144645571ea","Type":"ContainerDied","Data":"994a4cc398068e86e5999ecf6c97fa98d5ee81f93d833cc78758e1ae9b9692f4"} Apr 24 21:27:54.836378 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.836359 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:54.859623 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:54.859584 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" podStartSLOduration=9.952028082 podStartE2EDuration="26.85957321s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.271525485 +0000 UTC m=+3.069588433" lastFinishedPulling="2026-04-24 21:27:48.179070613 +0000 UTC m=+19.977133561" observedRunningTime="2026-04-24 21:27:54.859023246 +0000 UTC m=+26.657086217" watchObservedRunningTime="2026-04-24 21:27:54.85957321 +0000 UTC m=+26.657636179" Apr 24 21:27:55.011674 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.011583 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c8k6b"] Apr 24 21:27:55.011821 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.011684 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b" Apr 24 21:27:55.011821 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:55.011772 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc" Apr 24 21:27:55.014935 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.014913 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m4t52"] Apr 24 21:27:55.015054 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.014986 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:27:55.015099 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:55.015079 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081" Apr 24 21:27:55.017026 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.017005 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7k8nd"] Apr 24 21:27:55.017098 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.017070 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:27:55.017151 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:55.017136 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0" Apr 24 21:27:55.827792 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.827710 2580 generic.go:358] "Generic (PLEG): container finished" podID="5a0ab547-ecd2-4df3-9477-0144645571ea" containerID="482199c5a3b29738031bba16b945c28d3e5206076b97b69e39c95896be0fdc3d" exitCode=0 Apr 24 21:27:55.828209 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.827787 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" event={"ID":"5a0ab547-ecd2-4df3-9477-0144645571ea","Type":"ContainerDied","Data":"482199c5a3b29738031bba16b945c28d3e5206076b97b69e39c95896be0fdc3d"} Apr 24 21:27:55.828209 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:55.828021 2580 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness" Apr 24 21:27:56.400046 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:56.400012 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz" Apr 24 21:27:56.684636 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:56.684349 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:27:56.684801 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:56.684360 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:56.684801 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:56.684671 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081"
Apr 24 21:27:56.684801 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:56.684362 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:56.684801 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:56.684749 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:56.685030 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:56.684828 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:58.425119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:58.425089 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:58.425664 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:58.425229 2580 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:58.425664 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:58.425297 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret podName:74148c1b-8ecf-4750-b22d-3bb17904b081 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:14.425279563 +0000 UTC m=+46.223342516 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret") pod "global-pull-secret-syncer-m4t52" (UID: "74148c1b-8ecf-4750-b22d-3bb17904b081") : object "kube-system"/"original-pull-secret" not registered
Apr 24 21:27:58.685129 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:58.685048 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:27:58.685129 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:58.685079 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:27:58.685129 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:27:58.684890 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:27:58.685376 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:58.685215 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:27:58.685376 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:58.685351 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:27:58.685596 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:27:58.685573 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081"
Apr 24 21:28:00.684706 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:00.684660 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:28:00.685328 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:00.684805 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-m4t52" podUID="74148c1b-8ecf-4750-b22d-3bb17904b081"
Apr 24 21:28:00.685328 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:00.684678 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:28:00.685328 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:00.684899 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-7k8nd" podUID="9a4e6661-523d-4d4f-bd08-473deabd33c0"
Apr 24 21:28:00.685328 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:00.684672 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:28:00.685328 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:00.685004 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-c8k6b" podUID="dff89703-eb5c-40dd-b22c-a598308414bc"
Apr 24 21:28:01.045576 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.045495 2580 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-36.ec2.internal" event="NodeReady"
Apr 24 21:28:01.045730 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.045642 2580 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 24 21:28:01.108888 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.108193 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h"]
Apr 24 21:28:01.137621 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.137573 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-6b4d778455-w6fvh"]
Apr 24 21:28:01.137784 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.137734 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h"
Apr 24 21:28:01.161277 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.161253 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.162029 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.161981 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.162784 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.162759 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"]
Apr 24 21:28:01.163141 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.162957 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.163959 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.163934 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-9g66z\""
Apr 24 21:28:01.165902 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.165862 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6crsc\""
Apr 24 21:28:01.165902 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.165870 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\""
Apr 24 21:28:01.166141 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.165938 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\""
Apr 24 21:28:01.166235 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.166150 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\""
Apr 24 21:28:01.177481 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.177438 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\""
Apr 24 21:28:01.178047 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.177989 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz"]
Apr 24 21:28:01.178148 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.178126 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"
Apr 24 21:28:01.181387 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.181361 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.181473 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.181428 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.181559 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.181547 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 24 21:28:01.181632 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.181567 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bh9nr\""
Apr 24 21:28:01.181632 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.181579 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 24 21:28:01.195946 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.195923 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7f4c79d4bd-dxrjx"]
Apr 24 21:28:01.196062 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.196045 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz"
Apr 24 21:28:01.199485 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.199447 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.199485 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.199480 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.199656 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.199614 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-95fs2\""
Apr 24 21:28:01.199723 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.199696 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 24 21:28:01.215533 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.215507 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f"]
Apr 24 21:28:01.215831 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.215811 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:01.218710 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.218616 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 24 21:28:01.218710 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.218624 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hhxk7\""
Apr 24 21:28:01.219190 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.218723 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 24 21:28:01.219190 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.218745 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.219190 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.219137 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 24 21:28:01.219190 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.219161 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 24 21:28:01.219538 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.219518 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.231218 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.231194 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-zxbmp"]
Apr 24 21:28:01.231469 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.231443 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f"
Apr 24 21:28:01.234006 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.233973 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 24 21:28:01.234115 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.234094 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.234236 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.234104 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-2jln7\""
Apr 24 21:28:01.234429 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.234413 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 24 21:28:01.234702 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.234687 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.245240 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245220 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-trusted-ca\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.245324 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245254 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gdxqd\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-kube-api-access-gdxqd\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.245324 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245283 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-59rx5\" (UniqueName: \"kubernetes.io/projected/f161940e-db46-4df4-9316-662b09a296c4-kube-api-access-59rx5\") pod \"volume-data-source-validator-7c6cbb6c87-pl48h\" (UID: \"f161940e-db46-4df4-9316-662b09a296c4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h"
Apr 24 21:28:01.245324 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245313 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-bound-sa-token\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.245482 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245347 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-certificates\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.245482 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245416 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-image-registry-private-configuration\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.245482 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245441 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.245482 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245467 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-ca-trust-extracted\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.245659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.245514 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-installation-pull-secrets\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.255097 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.255076 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5"]
Apr 24 21:28:01.255269 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.255246 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:01.261912 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.261892 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.262172 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.261916 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.262172 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.261921 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-n45c8\""
Apr 24 21:28:01.262172 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.261978 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\""
Apr 24 21:28:01.262172 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.261981 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\""
Apr 24 21:28:01.266878 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.266859 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\""
Apr 24 21:28:01.277172 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.277101 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx"]
Apr 24 21:28:01.277277 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.277260 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5"
Apr 24 21:28:01.280511 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.280494 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-2xg8j\""
Apr 24 21:28:01.280747 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.280674 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.280747 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.280707 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\""
Apr 24 21:28:01.280893 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.280853 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.281306 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.281289 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\""
Apr 24 21:28:01.298620 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.298547 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl"]
Apr 24 21:28:01.298726 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.298614 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx"
Apr 24 21:28:01.303636 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.303614 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-qb4sw\""
Apr 24 21:28:01.303899 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.303720 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 24 21:28:01.303899 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.303731 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 24 21:28:01.303899 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.303881 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 24 21:28:01.304083 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.303976 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 24 21:28:01.314358 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.314337 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6"]
Apr 24 21:28:01.314463 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.314387 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl"
Apr 24 21:28:01.319149 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.319082 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"work-manager-hub-kubeconfig\""
Apr 24 21:28:01.333044 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.333019 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w6q7n"]
Apr 24 21:28:01.333151 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.333116 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6"
Apr 24 21:28:01.342123 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.342104 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-open-cluster-management.io-proxy-agent-signer-client-cert\""
Apr 24 21:28:01.342246 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.342231 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-ca\""
Apr 24 21:28:01.342434 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.342422 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-service-proxy-server-certificates\""
Apr 24 21:28:01.342812 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.342800 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"cluster-proxy-hub-kubeconfig\""
Apr 24 21:28:01.346065 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346047 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zsz7c\" (UniqueName: \"kubernetes.io/projected/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-kube-api-access-zsz7c\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f"
Apr 24 21:28:01.346126 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346073 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae26ae6-db56-4447-825f-208c0ab19d34-config\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:01.346126 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346090 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae26ae6-db56-4447-825f-208c0ab19d34-serving-cert\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:01.346126 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346115 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae26ae6-db56-4447-825f-208c0ab19d34-trusted-ca\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:01.346206 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346166 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-59rx5\" (UniqueName: \"kubernetes.io/projected/f161940e-db46-4df4-9316-662b09a296c4-kube-api-access-59rx5\") pod \"volume-data-source-validator-7c6cbb6c87-pl48h\" (UID: \"f161940e-db46-4df4-9316-662b09a296c4\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h"
Apr 24 21:28:01.346245 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346229 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-bound-sa-token\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.346280 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346264 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f"
Apr 24 21:28:01.346309 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346295 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6cd08957-865f-4442-98d3-f5cd050c3fb6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"
Apr 24 21:28:01.346341 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346330 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-certificates\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.346379 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346366 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"
Apr 24 21:28:01.346421 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346409 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:01.346461 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346448 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-image-registry-private-configuration\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.346499 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346476 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.346533 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346496 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-ca-trust-extracted\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.346533 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346526 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-installation-pull-secrets\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.346589 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-default-certificate\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:01.346589 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346572 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-trusted-ca\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.346589 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.346576 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:28:01.346589 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.346589 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4d778455-w6fvh: secret "image-registry-tls" not found
Apr 24 21:28:01.346694 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346590 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:01.346694 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346617 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vlk7p\" (UniqueName: \"kubernetes.io/projected/a318551b-2eb1-436a-8979-f4e740c2e662-kube-api-access-vlk7p\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:01.346870 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.346841 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls podName:4b822c28-ad3a-4754-8c9b-aeaf91af3b98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.846820847 +0000 UTC m=+33.644883812 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls") pod "image-registry-6b4d778455-w6fvh" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98") : secret "image-registry-tls" not found
Apr 24 21:28:01.346965 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346875 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-config\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f"
Apr 24 21:28:01.346965 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346928 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gdxqd\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-kube-api-access-gdxqd\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.347075 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.346962 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-97bg9\" (UniqueName: \"kubernetes.io/projected/8ae26ae6-db56-4447-825f-208c0ab19d34-kube-api-access-97bg9\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:01.347075 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.347019 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID:
\"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:01.347075 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.347053 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q2gnb\" (UniqueName: \"kubernetes.io/projected/0df49e02-8899-47f3-804e-9975c913c649-kube-api-access-q2gnb\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:01.347220 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.347101 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kt5sp\" (UniqueName: \"kubernetes.io/projected/6cd08957-865f-4442-98d3-f5cd050c3fb6-kube-api-access-kt5sp\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:01.347220 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.347139 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-stats-auth\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.352432 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352411 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-ca-trust-extracted\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:01.352611 ip-10-0-133-36 
kubenswrapper[2580]: I0424 21:28:01.352581 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h"] Apr 24 21:28:01.352611 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352608 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5"] Apr 24 21:28:01.352768 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352623 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz"] Apr 24 21:28:01.352768 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352633 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6"] Apr 24 21:28:01.352768 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352643 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w6q7n"] Apr 24 21:28:01.352768 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352649 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-certificates\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:01.352768 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352656 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm"] Apr 24 21:28:01.352768 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352745 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.353094 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.352837 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-trusted-ca\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:01.356282 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.356263 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-image-registry-private-configuration\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:01.356411 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.356270 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-installation-pull-secrets\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:01.366743 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.366720 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 24 21:28:01.366841 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.366769 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 24 21:28:01.366841 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.366820 2580 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 24 21:28:01.367225 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.367191 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 24 21:28:01.367472 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.367458 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-2k5sr\"" Apr 24 21:28:01.372296 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.372277 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-5gnht"] Apr 24 21:28:01.372431 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.372417 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm" Apr 24 21:28:01.372965 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.372950 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 24 21:28:01.380367 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.380345 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gdxqd\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-kube-api-access-gdxqd\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:01.380560 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.380540 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-59rx5\" (UniqueName: \"kubernetes.io/projected/f161940e-db46-4df4-9316-662b09a296c4-kube-api-access-59rx5\") pod \"volume-data-source-validator-7c6cbb6c87-pl48h\" (UID: \"f161940e-db46-4df4-9316-662b09a296c4\") " 
pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h" Apr 24 21:28:01.380633 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.380616 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-bound-sa-token\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:01.381272 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.381255 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-4bwm5\"" Apr 24 21:28:01.381395 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.381380 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 24 21:28:01.381499 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.381484 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 24 21:28:01.388323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388305 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"] Apr 24 21:28:01.388323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388326 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm"] Apr 24 21:28:01.388440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388335 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl"] Apr 24 21:28:01.388440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388345 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["openshift-console-operator/console-operator-9d4b6777b-zxbmp"] Apr 24 21:28:01.388440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388353 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5gnht"] Apr 24 21:28:01.388440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388363 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f"] Apr 24 21:28:01.388440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388374 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7f4c79d4bd-dxrjx"] Apr 24 21:28:01.388440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388387 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b4d778455-w6fvh"] Apr 24 21:28:01.388440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388401 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx"] Apr 24 21:28:01.388440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388416 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-fp6xs"] Apr 24 21:28:01.388662 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.388456 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5gnht" Apr 24 21:28:01.390524 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.390508 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 24 21:28:01.390524 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.390524 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ssv7d\"" Apr 24 21:28:01.390665 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.390544 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 24 21:28:01.390925 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.390911 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 24 21:28:01.406007 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.405971 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fp6xs"] Apr 24 21:28:01.406123 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.406109 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:01.408249 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.408231 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 24 21:28:01.408391 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.408376 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 24 21:28:01.408650 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.408635 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 24 21:28:01.408950 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.408935 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xsnkf\"" Apr 24 21:28:01.409138 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.409124 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 24 21:28:01.447989 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.447960 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8xkln\" (UniqueName: \"kubernetes.io/projected/553124ae-a77f-4ced-abab-751764ac01e1-kube-api-access-8xkln\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.448113 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448015 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6cd08957-865f-4442-98d3-f5cd050c3fb6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " 
pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:01.448113 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448042 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-tmp\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.448113 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448058 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d41f9802-2338-451e-b318-e3503d861cab-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:01.448113 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448074 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-snapshots\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.448113 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448094 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.448311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448136 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-2m8bh\" (UniqueName: \"kubernetes.io/projected/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-kube-api-access-2m8bh\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.448311 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.448166 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:28:01.448311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448175 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-serving-cert\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.448311 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.448209 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.948193779 +0000 UTC m=+33.746256727 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : secret "router-metrics-certs-default" not found Apr 24 21:28:01.448311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448221 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.448311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448239 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1f4c7e1b-114a-4d59-9aed-53e71a116cf9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-865464b9cc-lrxn5\" (UID: \"1f4c7e1b-114a-4d59-9aed-53e71a116cf9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5" Apr 24 21:28:01.448311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448254 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lrzj4\" (UniqueName: \"kubernetes.io/projected/d41f9802-2338-451e-b318-e3503d861cab-kube-api-access-lrzj4\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:01.448311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448285 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: 
\"kubernetes.io/configmap/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-config\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448315 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zsz7c\" (UniqueName: \"kubernetes.io/projected/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-kube-api-access-zsz7c\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448360 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4d3d8ce-54d9-4c28-8043-df84ae070d16-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448407 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q2gnb\" (UniqueName: \"kubernetes.io/projected/0df49e02-8899-47f3-804e-9975c913c649-kube-api-access-q2gnb\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448448 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-stats-auth\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: 
\"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448469 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae26ae6-db56-4447-825f-208c0ab19d34-config\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448485 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae26ae6-db56-4447-825f-208c0ab19d34-trusted-ca\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448510 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d41f9802-2338-451e-b318-e3503d861cab-tmp\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448540 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448570 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"hub\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-hub\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448599 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-service-ca-bundle\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.448659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448639 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448674 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448724 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d3d8ce-54d9-4c28-8043-df84ae070d16-config\") pod 
\"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448786 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae26ae6-db56-4447-825f-208c0ab19d34-serving-cert\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448816 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cnvxv\" (UniqueName: \"kubernetes.io/projected/a4d3d8ce-54d9-4c28-8043-df84ae070d16-kube-api-access-cnvxv\") pod \"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448867 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-default-certificate\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448901 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vlk7p\" (UniqueName: \"kubernetes.io/projected/a318551b-2eb1-436a-8979-f4e740c2e662-kube-api-access-vlk7p\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " 
pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448905 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-config\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.448936 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449013 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kt5sp\" (UniqueName: \"kubernetes.io/projected/6cd08957-865f-4442-98d3-f5cd050c3fb6-kube-api-access-kt5sp\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449036 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/6cd08957-865f-4442-98d3-f5cd050c3fb6-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.449047 2580 secret.go:189] Couldn't get secret 
openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-ca\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449094 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fmhq8\" (UniqueName: \"kubernetes.io/projected/1f4c7e1b-114a-4d59-9aed-53e71a116cf9-kube-api-access-fmhq8\") pod \"managed-serviceaccount-addon-agent-865464b9cc-lrxn5\" (UID: \"1f4c7e1b-114a-4d59-9aed-53e71a116cf9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5" Apr 24 21:28:01.449180 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.449112 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls podName:0df49e02-8899-47f3-804e-9975c913c649 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.949094492 +0000 UTC m=+33.747157441 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pxvnz" (UID: "0df49e02-8899-47f3-804e-9975c913c649") : secret "samples-operator-tls" not found Apr 24 21:28:01.449831 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449229 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/8ae26ae6-db56-4447-825f-208c0ab19d34-config\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" Apr 24 21:28:01.449831 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449319 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.449831 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.449351 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:01.449831 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.449399 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls podName:6cd08957-865f-4442-98d3-f5cd050c3fb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.949384398 +0000 UTC m=+33.747447355 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-542j7" (UID: "6cd08957-865f-4442-98d3-f5cd050c3fb6") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:01.449831 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449471 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/553124ae-a77f-4ced-abab-751764ac01e1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.449831 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.449477 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:01.949465886 +0000 UTC m=+33.747528854 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : configmap references non-existent config key: service-ca.crt Apr 24 21:28:01.449831 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449619 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-97bg9\" (UniqueName: \"kubernetes.io/projected/8ae26ae6-db56-4447-825f-208c0ab19d34-kube-api-access-97bg9\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" Apr 24 21:28:01.449831 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.449649 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.451504 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.451476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-serving-cert\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" Apr 24 21:28:01.451624 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.451607 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/8ae26ae6-db56-4447-825f-208c0ab19d34-serving-cert\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " 
pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" Apr 24 21:28:01.451666 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.451646 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-default-certificate\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.451810 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.451791 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-stats-auth\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.454567 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.454554 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h" Apr 24 21:28:01.460210 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.460184 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/8ae26ae6-db56-4447-825f-208c0ab19d34-trusted-ca\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" Apr 24 21:28:01.463779 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.463760 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zsz7c\" (UniqueName: \"kubernetes.io/projected/d1681d40-7ce7-4810-b3ea-1c27861ac3d8-kube-api-access-zsz7c\") pod \"service-ca-operator-d6fc45fc5-4kn9f\" (UID: \"d1681d40-7ce7-4810-b3ea-1c27861ac3d8\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" Apr 24 21:28:01.464907 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.464885 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kt5sp\" (UniqueName: \"kubernetes.io/projected/6cd08957-865f-4442-98d3-f5cd050c3fb6-kube-api-access-kt5sp\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:01.465954 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.465923 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vlk7p\" (UniqueName: \"kubernetes.io/projected/a318551b-2eb1-436a-8979-f4e740c2e662-kube-api-access-vlk7p\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:01.466165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.466151 2580 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"kube-api-access-97bg9\" (UniqueName: \"kubernetes.io/projected/8ae26ae6-db56-4447-825f-208c0ab19d34-kube-api-access-97bg9\") pod \"console-operator-9d4b6777b-zxbmp\" (UID: \"8ae26ae6-db56-4447-825f-208c0ab19d34\") " pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" Apr 24 21:28:01.471894 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.471872 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q2gnb\" (UniqueName: \"kubernetes.io/projected/0df49e02-8899-47f3-804e-9975c913c649-kube-api-access-q2gnb\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:01.540641 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.540373 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" Apr 24 21:28:01.550672 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550580 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8xkln\" (UniqueName: \"kubernetes.io/projected/553124ae-a77f-4ced-abab-751764ac01e1-kube-api-access-8xkln\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.550672 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550645 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-tmp\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.550834 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550674 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d41f9802-2338-451e-b318-e3503d861cab-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:01.550834 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-snapshots\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.550834 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2m8bh\" (UniqueName: \"kubernetes.io/projected/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-kube-api-access-2m8bh\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.550834 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550779 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-serving-cert\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.550834 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550808 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xznhf\" (UniqueName: \"kubernetes.io/projected/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-kube-api-access-xznhf\") pod \"ingress-canary-5gnht\" (UID: 
\"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht" Apr 24 21:28:01.551096 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550839 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.551096 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550865 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1f4c7e1b-114a-4d59-9aed-53e71a116cf9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-865464b9cc-lrxn5\" (UID: \"1f4c7e1b-114a-4d59-9aed-53e71a116cf9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5" Apr 24 21:28:01.551096 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550892 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lrzj4\" (UniqueName: \"kubernetes.io/projected/d41f9802-2338-451e-b318-e3503d861cab-kube-api-access-lrzj4\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:01.551096 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550936 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-config-volume\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:01.551096 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.550968 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5hzz5\" (UniqueName: \"kubernetes.io/projected/3b79033d-a984-4892-b6db-1971346abcd5-kube-api-access-5hzz5\") pod \"network-check-source-8894fc9bd-hwxfm\" (UID: \"3b79033d-a984-4892-b6db-1971346abcd5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm" Apr 24 21:28:01.551096 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.551024 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/a4d3d8ce-54d9-4c28-8043-df84ae070d16-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" Apr 24 21:28:01.551096 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.551060 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d41f9802-2338-451e-b318-e3503d861cab-tmp\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:01.551434 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.551123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-hub\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.551434 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.551194 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-service-ca-bundle\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.551434 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.551226 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht" Apr 24 21:28:01.551434 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.551264 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-tmp-dir\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:01.552647 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.552577 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-service-ca-bundle\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.552647 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.552585 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-tmp\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.552903 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.552872 2580 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d41f9802-2338-451e-b318-e3503d861cab-tmp\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553512 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553569 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d3d8ce-54d9-4c28-8043-df84ae070d16-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553633 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cnvxv\" (UniqueName: \"kubernetes.io/projected/a4d3d8ce-54d9-4c28-8043-df84ae070d16-kube-api-access-cnvxv\") pod \"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553698 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca\" (UniqueName: 
\"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-ca\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553727 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fmhq8\" (UniqueName: \"kubernetes.io/projected/1f4c7e1b-114a-4d59-9aed-53e71a116cf9-kube-api-access-fmhq8\") pod \"managed-serviceaccount-addon-agent-865464b9cc-lrxn5\" (UID: \"1f4c7e1b-114a-4d59-9aed-53e71a116cf9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553767 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/553124ae-a77f-4ced-abab-751764ac01e1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553806 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553833 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jqkw7\" (UniqueName: \"kubernetes.io/projected/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-kube-api-access-jqkw7\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " 
pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553865 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.554876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.553964 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-snapshots\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.555450 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.555294 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/a4d3d8ce-54d9-4c28-8043-df84ae070d16-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" Apr 24 21:28:01.557306 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.557284 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-hub\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.558104 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.558077 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: 
\"kubernetes.io/secret/a4d3d8ce-54d9-4c28-8043-df84ae070d16-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" Apr 24 21:28:01.558265 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.558224 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-serving-cert\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.558641 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.558497 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"klusterlet-config\" (UniqueName: \"kubernetes.io/secret/d41f9802-2338-451e-b318-e3503d861cab-klusterlet-config\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:01.559430 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.559384 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n" Apr 24 21:28:01.559520 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.559456 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-hub-kubeconfig\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " 
pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.559876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.559840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-ca\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.559941 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.559852 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ocpservice-ca\" (UniqueName: \"kubernetes.io/configmap/553124ae-a77f-4ced-abab-751764ac01e1-ocpservice-ca\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.560284 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.560058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/1f4c7e1b-114a-4d59-9aed-53e71a116cf9-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-865464b9cc-lrxn5\" (UID: \"1f4c7e1b-114a-4d59-9aed-53e71a116cf9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5" Apr 24 21:28:01.560284 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.560149 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-proxy-server-cert\" (UniqueName: \"kubernetes.io/secret/553124ae-a77f-4ced-abab-751764ac01e1-service-proxy-server-cert\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 21:28:01.564987 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.564898 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8xkln\" (UniqueName: \"kubernetes.io/projected/553124ae-a77f-4ced-abab-751764ac01e1-kube-api-access-8xkln\") pod \"cluster-proxy-proxy-agent-66f65d7c76-zsch6\" (UID: \"553124ae-a77f-4ced-abab-751764ac01e1\") " pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6"
Apr 24 21:28:01.565147 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.565136 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:01.567450 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.567424 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2m8bh\" (UniqueName: \"kubernetes.io/projected/fc9822dc-9a3d-4fdf-94b1-053fb0f0608b-kube-api-access-2m8bh\") pod \"insights-operator-585dfdc468-w6q7n\" (UID: \"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b\") " pod="openshift-insights/insights-operator-585dfdc468-w6q7n"
Apr 24 21:28:01.568321 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.568288 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lrzj4\" (UniqueName: \"kubernetes.io/projected/d41f9802-2338-451e-b318-e3503d861cab-kube-api-access-lrzj4\") pod \"klusterlet-addon-workmgr-7c9cc58456-r64pl\" (UID: \"d41f9802-2338-451e-b318-e3503d861cab\") " pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl"
Apr 24 21:28:01.568692 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.568657 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fmhq8\" (UniqueName: \"kubernetes.io/projected/1f4c7e1b-114a-4d59-9aed-53e71a116cf9-kube-api-access-fmhq8\") pod \"managed-serviceaccount-addon-agent-865464b9cc-lrxn5\" (UID: \"1f4c7e1b-114a-4d59-9aed-53e71a116cf9\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5"
Apr 24 21:28:01.575317 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.575290 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cnvxv\" (UniqueName: \"kubernetes.io/projected/a4d3d8ce-54d9-4c28-8043-df84ae070d16-kube-api-access-cnvxv\") pod \"kube-storage-version-migrator-operator-6769c5d45-74bgx\" (UID: \"a4d3d8ce-54d9-4c28-8043-df84ae070d16\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx"
Apr 24 21:28:01.601606 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.601575 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5"
Apr 24 21:28:01.606622 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.606565 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h"]
Apr 24 21:28:01.610281 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.610263 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx"
Apr 24 21:28:01.612187 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.612168 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-trd5d"]
Apr 24 21:28:01.625433 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.625410 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl"
Apr 24 21:28:01.627355 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.627340 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.629984 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.629965 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-wwmb2\""
Apr 24 21:28:01.641236 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.641217 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6"
Apr 24 21:28:01.654967 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.654949 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-config-volume\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:01.655087 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.654978 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5hzz5\" (UniqueName: \"kubernetes.io/projected/3b79033d-a984-4892-b6db-1971346abcd5-kube-api-access-5hzz5\") pod \"network-check-source-8894fc9bd-hwxfm\" (UID: \"3b79033d-a984-4892-b6db-1971346abcd5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm"
Apr 24 21:28:01.655087 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.655023 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht"
Apr 24 21:28:01.655168 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.655095 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:01.655168 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.655134 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert podName:545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.155121896 +0000 UTC m=+33.953184844 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert") pod "ingress-canary-5gnht" (UID: "545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804") : secret "canary-serving-cert" not found
Apr 24 21:28:01.655168 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.655145 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-tmp-dir\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:01.655298 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.655243 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:01.655298 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.655262 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jqkw7\" (UniqueName: \"kubernetes.io/projected/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-kube-api-access-jqkw7\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:01.655468 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.655312 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xznhf\" (UniqueName: \"kubernetes.io/projected/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-kube-api-access-xznhf\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht"
Apr 24 21:28:01.655511 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.655444 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:01.655582 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.655519 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls podName:78c8f2d7-06fd-41fd-91a1-02af0d79bea4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.155499314 +0000 UTC m=+33.953562277 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls") pod "dns-default-fp6xs" (UID: "78c8f2d7-06fd-41fd-91a1-02af0d79bea4") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:01.655582 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.655563 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-config-volume\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:01.655582 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.655570 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-tmp-dir\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:01.670783 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.670753 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f"]
Apr 24 21:28:01.672242 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.672223 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jqkw7\" (UniqueName: \"kubernetes.io/projected/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-kube-api-access-jqkw7\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:01.673679 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.673663 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-w6q7n"
Apr 24 21:28:01.675397 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.675380 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xznhf\" (UniqueName: \"kubernetes.io/projected/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-kube-api-access-xznhf\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht"
Apr 24 21:28:01.675985 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.675969 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5hzz5\" (UniqueName: \"kubernetes.io/projected/3b79033d-a984-4892-b6db-1971346abcd5-kube-api-access-5hzz5\") pod \"network-check-source-8894fc9bd-hwxfm\" (UID: \"3b79033d-a984-4892-b6db-1971346abcd5\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm"
Apr 24 21:28:01.680759 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.680741 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm"
Apr 24 21:28:01.712074 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:01.712039 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf161940e_db46_4df4_9316_662b09a296c4.slice/crio-5d265d6068a4430600d8fa4f6ef746ec7f467911c90a1f45ec3416e03bf59186 WatchSource:0}: Error finding container 5d265d6068a4430600d8fa4f6ef746ec7f467911c90a1f45ec3416e03bf59186: Status 404 returned error can't find the container with id 5d265d6068a4430600d8fa4f6ef746ec7f467911c90a1f45ec3416e03bf59186
Apr 24 21:28:01.712669 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:01.712453 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1681d40_7ce7_4810_b3ea_1c27861ac3d8.slice/crio-078ec40de55e6d74d50a7a7aa6824c3dfe6cebe0e52d7951923606af01051f0c WatchSource:0}: Error finding container 078ec40de55e6d74d50a7a7aa6824c3dfe6cebe0e52d7951923606af01051f0c: Status 404 returned error can't find the container with id 078ec40de55e6d74d50a7a7aa6824c3dfe6cebe0e52d7951923606af01051f0c
Apr 24 21:28:01.756270 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.756241 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa996ab4-6084-48e7-92d1-518c14773d43-hosts-file\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.756370 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.756318 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa996ab4-6084-48e7-92d1-518c14773d43-tmp-dir\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.756463 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.756448 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8w8t\" (UniqueName: \"kubernetes.io/projected/fa996ab4-6084-48e7-92d1-518c14773d43-kube-api-access-c8w8t\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.859549 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.857930 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:01.859549 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.858043 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa996ab4-6084-48e7-92d1-518c14773d43-hosts-file\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.859549 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.858107 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa996ab4-6084-48e7-92d1-518c14773d43-tmp-dir\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.859549 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.858233 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8w8t\" (UniqueName: \"kubernetes.io/projected/fa996ab4-6084-48e7-92d1-518c14773d43-kube-api-access-c8w8t\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.859549 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.858681 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:28:01.859549 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.858698 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4d778455-w6fvh: secret "image-registry-tls" not found
Apr 24 21:28:01.859549 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.858755 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls podName:4b822c28-ad3a-4754-8c9b-aeaf91af3b98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.858736155 +0000 UTC m=+34.656799108 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls") pod "image-registry-6b4d778455-w6fvh" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98") : secret "image-registry-tls" not found
Apr 24 21:28:01.859549 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.858816 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/fa996ab4-6084-48e7-92d1-518c14773d43-hosts-file\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.865098 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.860416 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa996ab4-6084-48e7-92d1-518c14773d43-tmp-dir\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.872478 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.872395 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8w8t\" (UniqueName: \"kubernetes.io/projected/fa996ab4-6084-48e7-92d1-518c14773d43-kube-api-access-c8w8t\") pod \"node-resolver-trd5d\" (UID: \"fa996ab4-6084-48e7-92d1-518c14773d43\") " pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.878347 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.878253 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" event={"ID":"d1681d40-7ce7-4810-b3ea-1c27861ac3d8","Type":"ContainerStarted","Data":"078ec40de55e6d74d50a7a7aa6824c3dfe6cebe0e52d7951923606af01051f0c"}
Apr 24 21:28:01.892734 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.892648 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h" event={"ID":"f161940e-db46-4df4-9316-662b09a296c4","Type":"ContainerStarted","Data":"5d265d6068a4430600d8fa4f6ef746ec7f467911c90a1f45ec3416e03bf59186"}
Apr 24 21:28:01.938526 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.937095 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-trd5d"
Apr 24 21:28:01.961182 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.960437 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"
Apr 24 21:28:01.961182 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.960551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz"
Apr 24 21:28:01.961182 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.960588 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:01.961182 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.960653 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:01.961182 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.960794 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 24 21:28:01.961182 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.960862 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.960841517 +0000 UTC m=+34.758904484 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : secret "router-metrics-certs-default" not found
Apr 24 21:28:01.962164 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.961931 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 24 21:28:01.962164 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.962013 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls podName:6cd08957-865f-4442-98d3-f5cd050c3fb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.961976911 +0000 UTC m=+34.760039865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-542j7" (UID: "6cd08957-865f-4442-98d3-f5cd050c3fb6") : secret "cluster-monitoring-operator-tls" not found
Apr 24 21:28:01.962164 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.962074 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 24 21:28:01.962164 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.962110 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls podName:0df49e02-8899-47f3-804e-9975c913c649 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.962097078 +0000 UTC m=+34.760160035 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pxvnz" (UID: "0df49e02-8899-47f3-804e-9975c913c649") : secret "samples-operator-tls" not found
Apr 24 21:28:01.962164 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:01.962128 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:02.962119158 +0000 UTC m=+34.760182112 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : configmap references non-existent config key: service-ca.crt
Apr 24 21:28:01.985615 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.985581 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6"]
Apr 24 21:28:01.992099 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:01.991820 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod553124ae_a77f_4ced_abab_751764ac01e1.slice/crio-2cedde63282c2b59b4a9371aaa4efdba26932ef4e6b39ce4a79a38fcb7da257d WatchSource:0}: Error finding container 2cedde63282c2b59b4a9371aaa4efdba26932ef4e6b39ce4a79a38fcb7da257d: Status 404 returned error can't find the container with id 2cedde63282c2b59b4a9371aaa4efdba26932ef4e6b39ce4a79a38fcb7da257d
Apr 24 21:28:01.993421 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.993368 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl"]
Apr 24 21:28:01.998348 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:01.998323 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-zxbmp"]
Apr 24 21:28:01.998455 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:01.998388 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd41f9802_2338_451e_b318_e3503d861cab.slice/crio-e45e39504bed8b48c0f315d7fc778fb7d9bf617890e3661a4d77454132bdd93d WatchSource:0}: Error finding container e45e39504bed8b48c0f315d7fc778fb7d9bf617890e3661a4d77454132bdd93d: Status 404 returned error can't find the container with id e45e39504bed8b48c0f315d7fc778fb7d9bf617890e3661a4d77454132bdd93d
Apr 24 21:28:02.001172 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:02.001149 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ae26ae6_db56_4447_825f_208c0ab19d34.slice/crio-6c59bb3957a55ef37a4026c0ee8b85d1220504098056f289366b0e73b006f271 WatchSource:0}: Error finding container 6c59bb3957a55ef37a4026c0ee8b85d1220504098056f289366b0e73b006f271: Status 404 returned error can't find the container with id 6c59bb3957a55ef37a4026c0ee8b85d1220504098056f289366b0e73b006f271
Apr 24 21:28:02.006665 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.006642 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm"]
Apr 24 21:28:02.009851 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.009825 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5"]
Apr 24 21:28:02.011168 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.010919 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-w6q7n"]
Apr 24 21:28:02.016828 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:02.016799 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3b79033d_a984_4892_b6db_1971346abcd5.slice/crio-c1932448dd01a6bc9ec73a48473e944e10930cf731d020c0b73ffbbfb74930a4 WatchSource:0}: Error finding container c1932448dd01a6bc9ec73a48473e944e10930cf731d020c0b73ffbbfb74930a4: Status 404 returned error can't find the container with id c1932448dd01a6bc9ec73a48473e944e10930cf731d020c0b73ffbbfb74930a4
Apr 24 21:28:02.017369 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:02.017346 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod1f4c7e1b_114a_4d59_9aed_53e71a116cf9.slice/crio-1192f556b091481aee6900c134300795940b74cccae0c16c40ae64dc82cf4557 WatchSource:0}: Error finding container 1192f556b091481aee6900c134300795940b74cccae0c16c40ae64dc82cf4557: Status 404 returned error can't find the container with id 1192f556b091481aee6900c134300795940b74cccae0c16c40ae64dc82cf4557
Apr 24 21:28:02.018471 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:02.018445 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc9822dc_9a3d_4fdf_94b1_053fb0f0608b.slice/crio-42703645c953bae04e8ba984a90c29812ca9625fc1c20261e5506f654e30124d WatchSource:0}: Error finding container 42703645c953bae04e8ba984a90c29812ca9625fc1c20261e5506f654e30124d: Status 404 returned error can't find the container with id 42703645c953bae04e8ba984a90c29812ca9625fc1c20261e5506f654e30124d
Apr 24 21:28:02.046311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.046276 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx"]
Apr 24 21:28:02.050296 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:02.050158 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda4d3d8ce_54d9_4c28_8043_df84ae070d16.slice/crio-418127051f115ada6d2a2319c245ddc7d0e40b1b6d3a4123d3b73ad88a0431cd WatchSource:0}: Error finding container 418127051f115ada6d2a2319c245ddc7d0e40b1b6d3a4123d3b73ad88a0431cd: Status 404 returned error can't find the container with id 418127051f115ada6d2a2319c245ddc7d0e40b1b6d3a4123d3b73ad88a0431cd
Apr 24 21:28:02.162471 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.162441 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht"
Apr 24 21:28:02.162644 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.162632 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:02.162681 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.162643 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 24 21:28:02.162722 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.162710 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert podName:545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:03.162689619 +0000 UTC m=+34.960752589 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert") pod "ingress-canary-5gnht" (UID: "545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804") : secret "canary-serving-cert" not found
Apr 24 21:28:02.162764 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.162742 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 24 21:28:02.162807 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.162795 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls podName:78c8f2d7-06fd-41fd-91a1-02af0d79bea4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:03.16277813 +0000 UTC m=+34.960841092 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls") pod "dns-default-fp6xs" (UID: "78c8f2d7-06fd-41fd-91a1-02af0d79bea4") : secret "dns-default-metrics-tls" not found
Apr 24 21:28:02.363873 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.363794 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:28:02.364082 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.363947 2580 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:02.364082 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.364043 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs podName:dff89703-eb5c-40dd-b22c-a598308414bc nodeName:}" failed. No retries permitted until 2026-04-24 21:28:34.364025178 +0000 UTC m=+66.162088138 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs") pod "network-metrics-daemon-c8k6b" (UID: "dff89703-eb5c-40dd-b22c-a598308414bc") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 24 21:28:02.464888 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.464856 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:28:02.468927 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.468896 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r962z\" (UniqueName: \"kubernetes.io/projected/9a4e6661-523d-4d4f-bd08-473deabd33c0-kube-api-access-r962z\") pod \"network-check-target-7k8nd\" (UID: \"9a4e6661-523d-4d4f-bd08-473deabd33c0\") " pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:28:02.685183 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.685154 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52"
Apr 24 21:28:02.685381 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.685211 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:28:02.686064 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.685758 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:28:02.688703 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.688172 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:28:02.688703 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.688526 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7528m\""
Apr 24 21:28:02.689305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.689080 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 24 21:28:02.689305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.689127 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-q7cg2\""
Apr 24 21:28:02.715744 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.714820 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-7k8nd"
Apr 24 21:28:02.871777 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.869746 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:02.871777 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.870736 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 24 21:28:02.871777 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.870758 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4d778455-w6fvh: secret "image-registry-tls" not found
Apr 24 21:28:02.871777 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.870818 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls podName:4b822c28-ad3a-4754-8c9b-aeaf91af3b98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.870799823 +0000 UTC m=+36.668862771 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls") pod "image-registry-6b4d778455-w6fvh" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98") : secret "image-registry-tls" not found
Apr 24 21:28:02.903317 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.903271 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w6q7n" event={"ID":"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b","Type":"ContainerStarted","Data":"42703645c953bae04e8ba984a90c29812ca9625fc1c20261e5506f654e30124d"}
Apr 24 21:28:02.908463 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.908402 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" event={"ID":"8ae26ae6-db56-4447-825f-208c0ab19d34","Type":"ContainerStarted","Data":"6c59bb3957a55ef37a4026c0ee8b85d1220504098056f289366b0e73b006f271"}
Apr 24 21:28:02.934023 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.933025 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-trd5d" event={"ID":"fa996ab4-6084-48e7-92d1-518c14773d43","Type":"ContainerStarted","Data":"45570a258717505d86971a369da216a226179c6b1dccddc39579cfd0c70087cf"}
Apr 24 21:28:02.934023 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.933072 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-trd5d" event={"ID":"fa996ab4-6084-48e7-92d1-518c14773d43","Type":"ContainerStarted","Data":"c08cb13c07f16edb4393f49857588383715149311c6239fb983e0ef0b995c99b"}
Apr 24 21:28:02.937497 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.937389 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm" event={"ID":"3b79033d-a984-4892-b6db-1971346abcd5","Type":"ContainerStarted","Data":"c1932448dd01a6bc9ec73a48473e944e10930cf731d020c0b73ffbbfb74930a4"}
Apr 24 21:28:02.938965 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.938907 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" event={"ID":"553124ae-a77f-4ced-abab-751764ac01e1","Type":"ContainerStarted","Data":"2cedde63282c2b59b4a9371aaa4efdba26932ef4e6b39ce4a79a38fcb7da257d"}
Apr 24 21:28:02.945550 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.944709 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-7k8nd"]
Apr 24 21:28:02.949195 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.948885 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" event={"ID":"d41f9802-2338-451e-b318-e3503d861cab","Type":"ContainerStarted","Data":"e45e39504bed8b48c0f315d7fc778fb7d9bf617890e3661a4d77454132bdd93d"}
Apr 24 21:28:02.949458 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:02.949412 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a4e6661_523d_4d4f_bd08_473deabd33c0.slice/crio-8946dc973565a3255c42a442251dc38ad1d3dc03546bbff19ff4b5f9c36e9e6d WatchSource:0}: Error finding container 8946dc973565a3255c42a442251dc38ad1d3dc03546bbff19ff4b5f9c36e9e6d: Status 404 returned error can't find the container with id 8946dc973565a3255c42a442251dc38ad1d3dc03546bbff19ff4b5f9c36e9e6d
Apr 24 21:28:02.953312 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.953144 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5" event={"ID":"1f4c7e1b-114a-4d59-9aed-53e71a116cf9","Type":"ContainerStarted","Data":"1192f556b091481aee6900c134300795940b74cccae0c16c40ae64dc82cf4557"}
Apr 24 21:28:02.970361 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.969051 2580 generic.go:358] "Generic (PLEG):
container finished" podID="5a0ab547-ecd2-4df3-9477-0144645571ea" containerID="eb723a1f25960bd0c0af344e923f1650029daf5c252d2a4653c54f2a637f1e74" exitCode=0 Apr 24 21:28:02.974059 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.970572 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" event={"ID":"5a0ab547-ecd2-4df3-9477-0144645571ea","Type":"ContainerDied","Data":"eb723a1f25960bd0c0af344e923f1650029daf5c252d2a4653c54f2a637f1e74"} Apr 24 21:28:02.974059 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.973132 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-trd5d" podStartSLOduration=1.973114582 podStartE2EDuration="1.973114582s" podCreationTimestamp="2026-04-24 21:28:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:02.968861157 +0000 UTC m=+34.766924128" watchObservedRunningTime="2026-04-24 21:28:02.973114582 +0000 UTC m=+34.771177555" Apr 24 21:28:02.976371 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.974642 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:02.976371 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.974706 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:02.976371 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.974771 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:02.976371 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.974861 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:02.976371 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.975107 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.975088939 +0000 UTC m=+36.773151896 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : configmap references non-existent config key: service-ca.crt Apr 24 21:28:02.976371 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.975639 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:02.976371 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.975696 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls podName:0df49e02-8899-47f3-804e-9975c913c649 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.975672875 +0000 UTC m=+36.773735827 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pxvnz" (UID: "0df49e02-8899-47f3-804e-9975c913c649") : secret "samples-operator-tls" not found Apr 24 21:28:02.976371 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.976189 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:28:02.976371 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.976250 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.976218169 +0000 UTC m=+36.774281122 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : secret "router-metrics-certs-default" not found Apr 24 21:28:02.979742 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.979648 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:02.980020 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:02.979982 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls podName:6cd08957-865f-4442-98d3-f5cd050c3fb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:04.979682633 +0000 UTC m=+36.777745584 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-542j7" (UID: "6cd08957-865f-4442-98d3-f5cd050c3fb6") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:02.981806 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:02.981758 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" event={"ID":"a4d3d8ce-54d9-4c28-8043-df84ae070d16","Type":"ContainerStarted","Data":"418127051f115ada6d2a2319c245ddc7d0e40b1b6d3a4123d3b73ad88a0431cd"} Apr 24 21:28:03.177174 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:03.177129 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " 
pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:03.177326 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:03.177250 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht" Apr 24 21:28:03.177480 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:03.177457 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:03.177543 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:03.177520 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert podName:545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:05.177501849 +0000 UTC m=+36.975564800 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert") pod "ingress-canary-5gnht" (UID: "545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804") : secret "canary-serving-cert" not found Apr 24 21:28:03.177669 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:03.177457 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:03.177669 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:03.177647 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls podName:78c8f2d7-06fd-41fd-91a1-02af0d79bea4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:05.177630837 +0000 UTC m=+36.975693785 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls") pod "dns-default-fp6xs" (UID: "78c8f2d7-06fd-41fd-91a1-02af0d79bea4") : secret "dns-default-metrics-tls" not found Apr 24 21:28:04.010798 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:04.010482 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7k8nd" event={"ID":"9a4e6661-523d-4d4f-bd08-473deabd33c0","Type":"ContainerStarted","Data":"8946dc973565a3255c42a442251dc38ad1d3dc03546bbff19ff4b5f9c36e9e6d"} Apr 24 21:28:04.058499 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:04.057431 2580 generic.go:358] "Generic (PLEG): container finished" podID="5a0ab547-ecd2-4df3-9477-0144645571ea" containerID="ebf7dd4e94fe08b0ee5ab739f493a7930f275f8b38457dc584c4f8c97f8ed218" exitCode=0 Apr 24 21:28:04.058499 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:04.057539 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" event={"ID":"5a0ab547-ecd2-4df3-9477-0144645571ea","Type":"ContainerDied","Data":"ebf7dd4e94fe08b0ee5ab739f493a7930f275f8b38457dc584c4f8c97f8ed218"} Apr 24 21:28:04.896971 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:04.896921 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:04.897199 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.897182 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:04.897263 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.897204 2580 projected.go:194] Error preparing data for projected 
volume registry-tls for pod openshift-image-registry/image-registry-6b4d778455-w6fvh: secret "image-registry-tls" not found Apr 24 21:28:04.897263 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.897262 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls podName:4b822c28-ad3a-4754-8c9b-aeaf91af3b98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.897243863 +0000 UTC m=+40.695306816 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls") pod "image-registry-6b4d778455-w6fvh" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98") : secret "image-registry-tls" not found Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:04.998096 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:04.998194 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:04.998226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: 
\"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:04.998292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.998466 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.998529 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.99851101 +0000 UTC m=+40.796573963 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : secret "router-metrics-certs-default" not found Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.998927 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.998979 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls podName:6cd08957-865f-4442-98d3-f5cd050c3fb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.998963124 +0000 UTC m=+40.797026078 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-542j7" (UID: "6cd08957-865f-4442-98d3-f5cd050c3fb6") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:04.999046 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.999016 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.999006583 +0000 UTC m=+40.797069545 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : configmap references non-existent config key: service-ca.crt Apr 24 21:28:04.999588 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.999080 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:04.999588 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:04.999117 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls podName:0df49e02-8899-47f3-804e-9975c913c649 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:08.999106975 +0000 UTC m=+40.797169928 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pxvnz" (UID: "0df49e02-8899-47f3-804e-9975c913c649") : secret "samples-operator-tls" not found Apr 24 21:28:05.200859 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:05.199956 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:05.200859 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:05.200110 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht" Apr 24 
21:28:05.200859 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:05.200277 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:05.200859 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:05.200339 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert podName:545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:09.200320334 +0000 UTC m=+40.998383288 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert") pod "ingress-canary-5gnht" (UID: "545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804") : secret "canary-serving-cert" not found Apr 24 21:28:05.200859 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:05.200404 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:05.200859 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:05.200433 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls podName:78c8f2d7-06fd-41fd-91a1-02af0d79bea4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:09.200423655 +0000 UTC m=+40.998486604 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls") pod "dns-default-fp6xs" (UID: "78c8f2d7-06fd-41fd-91a1-02af0d79bea4") : secret "dns-default-metrics-tls" not found Apr 24 21:28:08.937311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:08.937268 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:08.937667 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:08.937406 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:08.937667 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:08.937426 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4d778455-w6fvh: secret "image-registry-tls" not found Apr 24 21:28:08.937667 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:08.937482 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls podName:4b822c28-ad3a-4754-8c9b-aeaf91af3b98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:16.937465924 +0000 UTC m=+48.735528877 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls") pod "image-registry-6b4d778455-w6fvh" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98") : secret "image-registry-tls" not found Apr 24 21:28:09.038060 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:09.038019 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:09.038220 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:09.038101 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:09.038220 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:09.038124 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:09.038220 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.038150 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:09.038220 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:09.038167 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:09.038364 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.038222 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls podName:6cd08957-865f-4442-98d3-f5cd050c3fb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:17.038200471 +0000 UTC m=+48.836263421 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-542j7" (UID: "6cd08957-865f-4442-98d3-f5cd050c3fb6") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:09.038364 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.038244 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:09.038364 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.038262 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:17.038251514 +0000 UTC m=+48.836314477 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : configmap references non-existent config key: service-ca.crt Apr 24 21:28:09.038364 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.038297 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls podName:0df49e02-8899-47f3-804e-9975c913c649 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:17.038284796 +0000 UTC m=+48.836347745 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pxvnz" (UID: "0df49e02-8899-47f3-804e-9975c913c649") : secret "samples-operator-tls" not found Apr 24 21:28:09.038364 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.038250 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:28:09.038364 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.038333 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:17.038322524 +0000 UTC m=+48.836385472 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : secret "router-metrics-certs-default" not found Apr 24 21:28:09.241168 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:09.241058 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht" Apr 24 21:28:09.241337 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.241217 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:09.241337 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:09.241260 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:09.241337 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.241288 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert podName:545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:17.241266646 +0000 UTC m=+49.039329598 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert") pod "ingress-canary-5gnht" (UID: "545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804") : secret "canary-serving-cert" not found Apr 24 21:28:09.241337 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.241331 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:09.241533 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:09.241383 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls podName:78c8f2d7-06fd-41fd-91a1-02af0d79bea4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:17.241368133 +0000 UTC m=+49.039431081 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls") pod "dns-default-fp6xs" (UID: "78c8f2d7-06fd-41fd-91a1-02af0d79bea4") : secret "dns-default-metrics-tls" not found Apr 24 21:28:14.495093 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:14.495060 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:28:14.498634 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:14.498611 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/74148c1b-8ecf-4750-b22d-3bb17904b081-original-pull-secret\") pod \"global-pull-secret-syncer-m4t52\" (UID: \"74148c1b-8ecf-4750-b22d-3bb17904b081\") " pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:28:14.704512 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:28:14.704158 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-m4t52" Apr 24 21:28:14.881040 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:14.880989 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-m4t52"] Apr 24 21:28:14.885049 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:14.885009 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74148c1b_8ecf_4750_b22d_3bb17904b081.slice/crio-8a10741f5bf9ecef976654e7883608d8456ab51dfe6708934c83876e89e2c3a6 WatchSource:0}: Error finding container 8a10741f5bf9ecef976654e7883608d8456ab51dfe6708934c83876e89e2c3a6: Status 404 returned error can't find the container with id 8a10741f5bf9ecef976654e7883608d8456ab51dfe6708934c83876e89e2c3a6 Apr 24 21:28:15.094981 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.094863 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" event={"ID":"a4d3d8ce-54d9-4c28-8043-df84ae070d16","Type":"ContainerStarted","Data":"ea9d4164042ac1c99aa84aa45fcfb52fb201d85477cb1620d78f37e033aaef1a"} Apr 24 21:28:15.097263 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.096845 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w6q7n" event={"ID":"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b","Type":"ContainerStarted","Data":"25bb5994c296204642b60fedbd1b803fa3fcf3fbc06ce5df4f5e92f7334c324e"} Apr 24 21:28:15.098319 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.098297 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/0.log" Apr 24 21:28:15.098392 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.098342 2580 generic.go:358] 
"Generic (PLEG): container finished" podID="8ae26ae6-db56-4447-825f-208c0ab19d34" containerID="ea679786de2c3d6d4c51d1625db99d4f4a643b04aaf98a102b290399e8a8a96a" exitCode=255 Apr 24 21:28:15.098392 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.098374 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" event={"ID":"8ae26ae6-db56-4447-825f-208c0ab19d34","Type":"ContainerDied","Data":"ea679786de2c3d6d4c51d1625db99d4f4a643b04aaf98a102b290399e8a8a96a"} Apr 24 21:28:15.098632 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.098616 2580 scope.go:117] "RemoveContainer" containerID="ea679786de2c3d6d4c51d1625db99d4f4a643b04aaf98a102b290399e8a8a96a" Apr 24 21:28:15.100528 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.100506 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h" event={"ID":"f161940e-db46-4df4-9316-662b09a296c4","Type":"ContainerStarted","Data":"6ef6ea53b34be0af852c46baaa973e826dde0a356147e1873c84ef30532a9699"} Apr 24 21:28:15.109974 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.109937 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm" event={"ID":"3b79033d-a984-4892-b6db-1971346abcd5","Type":"ContainerStarted","Data":"1c096c4af7265e66508e33daf0b24af8217684e7cfcd8d8706977516b88cc212"} Apr 24 21:28:15.111534 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.111508 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" event={"ID":"553124ae-a77f-4ced-abab-751764ac01e1","Type":"ContainerStarted","Data":"3ff9538a3a64a9d0907d79081dc8c909fdd3a94e378e8ad82df90b6fea0d8597"} Apr 24 21:28:15.112842 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.112816 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" event={"ID":"d41f9802-2338-451e-b318-e3503d861cab","Type":"ContainerStarted","Data":"999048c07230ec0a59459295a911653125a68586855a3be1c95766814de9aa16"} Apr 24 21:28:15.113042 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.113025 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:15.114553 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.114507 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" event={"ID":"d1681d40-7ce7-4810-b3ea-1c27861ac3d8","Type":"ContainerStarted","Data":"2857887464cf23ed13116a2c320cd195bcefdaf54107cbdb107c89bd400f0d56"} Apr 24 21:28:15.115595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.115266 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" Apr 24 21:28:15.116379 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.116347 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5" event={"ID":"1f4c7e1b-114a-4d59-9aed-53e71a116cf9","Type":"ContainerStarted","Data":"dd2c6da1dd8ca624bb279851bf3600612f5bc6f5df4df4b63fe0e7f8068af39d"} Apr 24 21:28:15.117705 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.117686 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-7k8nd" event={"ID":"9a4e6661-523d-4d4f-bd08-473deabd33c0","Type":"ContainerStarted","Data":"20ab5798753890b088bad57e0a231251be4e6b1b2fe56f8d02ad4812e5005477"} Apr 24 21:28:15.117806 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.117693 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" podStartSLOduration=6.918199207 podStartE2EDuration="19.117678157s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:28:02.052901042 +0000 UTC m=+33.850963991" lastFinishedPulling="2026-04-24 21:28:14.25237999 +0000 UTC m=+46.050442941" observedRunningTime="2026-04-24 21:28:15.115809741 +0000 UTC m=+46.913872725" watchObservedRunningTime="2026-04-24 21:28:15.117678157 +0000 UTC m=+46.915741130" Apr 24 21:28:15.117882 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.117824 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:28:15.118798 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.118777 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m4t52" event={"ID":"74148c1b-8ecf-4750-b22d-3bb17904b081","Type":"ContainerStarted","Data":"8a10741f5bf9ecef976654e7883608d8456ab51dfe6708934c83876e89e2c3a6"} Apr 24 21:28:15.122068 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.122046 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" event={"ID":"5a0ab547-ecd2-4df3-9477-0144645571ea","Type":"ContainerStarted","Data":"0bf28a787dd2301e4d0f5cfa4f693d168e43f7350d2a283f26ec7f4bab157a90"} Apr 24 21:28:15.144411 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.144359 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-hwxfm" podStartSLOduration=2.751447324 podStartE2EDuration="15.144340035s" podCreationTimestamp="2026-04-24 21:28:00 +0000 UTC" firstStartedPulling="2026-04-24 21:28:02.019273968 +0000 UTC m=+33.817336916" lastFinishedPulling="2026-04-24 21:28:14.41216667 +0000 UTC m=+46.210229627" observedRunningTime="2026-04-24 
21:28:15.143103449 +0000 UTC m=+46.941166420" watchObservedRunningTime="2026-04-24 21:28:15.144340035 +0000 UTC m=+46.942403007" Apr 24 21:28:15.192714 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.192655 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-pl48h" podStartSLOduration=17.677895624 podStartE2EDuration="29.192638667s" podCreationTimestamp="2026-04-24 21:27:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:01.724107266 +0000 UTC m=+33.522170230" lastFinishedPulling="2026-04-24 21:28:13.238850317 +0000 UTC m=+45.036913273" observedRunningTime="2026-04-24 21:28:15.164492618 +0000 UTC m=+46.962555591" watchObservedRunningTime="2026-04-24 21:28:15.192638667 +0000 UTC m=+46.990701638" Apr 24 21:28:15.192969 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.192943 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/klusterlet-addon-workmgr-7c9cc58456-r64pl" podStartSLOduration=28.792171217 podStartE2EDuration="41.192937554s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:28:01.999958748 +0000 UTC m=+33.798021708" lastFinishedPulling="2026-04-24 21:28:14.400725083 +0000 UTC m=+46.198788045" observedRunningTime="2026-04-24 21:28:15.19167564 +0000 UTC m=+46.989738611" watchObservedRunningTime="2026-04-24 21:28:15.192937554 +0000 UTC m=+46.991000525" Apr 24 21:28:15.266701 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.266639 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-w6q7n" podStartSLOduration=16.88750041 podStartE2EDuration="29.266606175s" podCreationTimestamp="2026-04-24 21:27:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:02.020846052 +0000 UTC m=+33.818909015" lastFinishedPulling="2026-04-24 21:28:14.399951817 +0000 UTC m=+46.198014780" 
observedRunningTime="2026-04-24 21:28:15.264434797 +0000 UTC m=+47.062497766" watchObservedRunningTime="2026-04-24 21:28:15.266606175 +0000 UTC m=+47.064669147" Apr 24 21:28:15.322065 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.321205 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-7k8nd" podStartSLOduration=35.872235904 podStartE2EDuration="47.321184277s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:02.954059213 +0000 UTC m=+34.752122177" lastFinishedPulling="2026-04-24 21:28:14.403007598 +0000 UTC m=+46.201070550" observedRunningTime="2026-04-24 21:28:15.294136108 +0000 UTC m=+47.092199075" watchObservedRunningTime="2026-04-24 21:28:15.321184277 +0000 UTC m=+47.119247249" Apr 24 21:28:15.367307 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.367117 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" podStartSLOduration=6.83910184 podStartE2EDuration="19.367074658s" podCreationTimestamp="2026-04-24 21:27:56 +0000 UTC" firstStartedPulling="2026-04-24 21:28:01.724406163 +0000 UTC m=+33.522469115" lastFinishedPulling="2026-04-24 21:28:14.25237897 +0000 UTC m=+46.050441933" observedRunningTime="2026-04-24 21:28:15.321580788 +0000 UTC m=+47.119643759" watchObservedRunningTime="2026-04-24 21:28:15.367074658 +0000 UTC m=+47.165137629" Apr 24 21:28:15.411796 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.411662 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-6hjtw" podStartSLOduration=15.924017093 podStartE2EDuration="46.411643549s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="2026-04-24 21:27:31.265856399 +0000 UTC m=+3.063919351" lastFinishedPulling="2026-04-24 21:28:01.753482842 +0000 UTC m=+33.551545807" 
observedRunningTime="2026-04-24 21:28:15.411391547 +0000 UTC m=+47.209454518" watchObservedRunningTime="2026-04-24 21:28:15.411643549 +0000 UTC m=+47.209706521" Apr 24 21:28:15.414350 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:15.412202 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-865464b9cc-lrxn5" podStartSLOduration=29.029801087 podStartE2EDuration="41.412190831s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:28:02.019334266 +0000 UTC m=+33.817397229" lastFinishedPulling="2026-04-24 21:28:14.401724021 +0000 UTC m=+46.199786973" observedRunningTime="2026-04-24 21:28:15.369106741 +0000 UTC m=+47.167169713" watchObservedRunningTime="2026-04-24 21:28:15.412190831 +0000 UTC m=+47.210253803" Apr 24 21:28:16.127727 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:16.127635 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 21:28:16.129675 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:16.128535 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/0.log" Apr 24 21:28:16.129675 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:16.128575 2580 generic.go:358] "Generic (PLEG): container finished" podID="8ae26ae6-db56-4447-825f-208c0ab19d34" containerID="12caf291f6178cb042f835b38ccf0be4a9570729901a6644b42c9d1b40043370" exitCode=255 Apr 24 21:28:16.129675 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:16.129670 2580 scope.go:117] "RemoveContainer" containerID="12caf291f6178cb042f835b38ccf0be4a9570729901a6644b42c9d1b40043370" Apr 24 21:28:16.129917 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:16.129867 2580 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zxbmp_openshift-console-operator(8ae26ae6-db56-4447-825f-208c0ab19d34)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" podUID="8ae26ae6-db56-4447-825f-208c0ab19d34" Apr 24 21:28:16.130141 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:16.130084 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" event={"ID":"8ae26ae6-db56-4447-825f-208c0ab19d34","Type":"ContainerDied","Data":"12caf291f6178cb042f835b38ccf0be4a9570729901a6644b42c9d1b40043370"} Apr 24 21:28:16.130141 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:16.130140 2580 scope.go:117] "RemoveContainer" containerID="ea679786de2c3d6d4c51d1625db99d4f4a643b04aaf98a102b290399e8a8a96a" Apr 24 21:28:17.031847 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.031810 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:17.032107 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.031970 2580 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 24 21:28:17.032107 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.032011 2580 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-6b4d778455-w6fvh: secret "image-registry-tls" not found Apr 24 21:28:17.032107 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.032087 2580 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls podName:4b822c28-ad3a-4754-8c9b-aeaf91af3b98 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.032070373 +0000 UTC m=+64.830133325 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls") pod "image-registry-6b4d778455-w6fvh" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98") : secret "image-registry-tls" not found Apr 24 21:28:17.132401 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.132364 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.132455 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.132490 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.132551 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.132554 2580 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.132580 2580 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.132621 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls podName:6cd08957-865f-4442-98d3-f5cd050c3fb6 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.132602058 +0000 UTC m=+64.930665011 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-542j7" (UID: "6cd08957-865f-4442-98d3-f5cd050c3fb6") : secret "cluster-monitoring-operator-tls" not found Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.132636 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls podName:0df49e02-8899-47f3-804e-9975c913c649 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.132630108 +0000 UTC m=+64.930693056 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-pxvnz" (UID: "0df49e02-8899-47f3-804e-9975c913c649") : secret "samples-operator-tls" not found Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.132675 2580 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 24 21:28:17.132746 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.132725 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.132709319 +0000 UTC m=+64.930772274 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : secret "router-metrics-certs-default" not found Apr 24 21:28:17.133182 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.132793 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle podName:a318551b-2eb1-436a-8979-f4e740c2e662 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.132780924 +0000 UTC m=+64.930843872 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle") pod "router-default-7f4c79d4bd-dxrjx" (UID: "a318551b-2eb1-436a-8979-f4e740c2e662") : configmap references non-existent config key: service-ca.crt Apr 24 21:28:17.133260 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.133246 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 21:28:17.133727 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.133711 2580 scope.go:117] "RemoveContainer" containerID="12caf291f6178cb042f835b38ccf0be4a9570729901a6644b42c9d1b40043370" Apr 24 21:28:17.133958 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.133929 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zxbmp_openshift-console-operator(8ae26ae6-db56-4447-825f-208c0ab19d34)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" podUID="8ae26ae6-db56-4447-825f-208c0ab19d34" Apr 24 21:28:17.335829 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.335735 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:17.335988 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.335854 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " 
pod="openshift-ingress-canary/ingress-canary-5gnht" Apr 24 21:28:17.335988 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.335902 2580 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 24 21:28:17.335988 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.335951 2580 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 24 21:28:17.335988 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.335982 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls podName:78c8f2d7-06fd-41fd-91a1-02af0d79bea4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.335960137 +0000 UTC m=+65.134023087 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls") pod "dns-default-fp6xs" (UID: "78c8f2d7-06fd-41fd-91a1-02af0d79bea4") : secret "dns-default-metrics-tls" not found Apr 24 21:28:17.336203 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:17.336024 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert podName:545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.336007251 +0000 UTC m=+65.134070220 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert") pod "ingress-canary-5gnht" (UID: "545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804") : secret "canary-serving-cert" not found Apr 24 21:28:17.385691 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.385657 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-trd5d_fa996ab4-6084-48e7-92d1-518c14773d43/dns-node-resolver/0.log" Apr 24 21:28:17.800264 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.800228 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-6ck65"] Apr 24 21:28:17.803904 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.803875 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:17.807901 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.807879 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 24 21:28:17.808114 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.808095 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 24 21:28:17.808227 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.808207 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qk948\"" Apr 24 21:28:17.816855 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.816832 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6ck65"] Apr 24 21:28:17.941788 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.941746 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w5llh\" (UniqueName: 
\"kubernetes.io/projected/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-kube-api-access-w5llh\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:17.941965 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.941802 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-crio-socket\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:17.941965 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.941852 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:17.941965 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.941945 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:17.942159 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:17.942055 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-data-volume\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " 
pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:18.042782 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.042745 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w5llh\" (UniqueName: \"kubernetes.io/projected/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-kube-api-access-w5llh\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:18.042971 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.042806 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-crio-socket\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:18.042971 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.042845 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:18.042971 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.042915 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65" Apr 24 21:28:18.043161 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.043013 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-data-volume\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:18.043161 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.043060 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-crio-socket\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:18.043261 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:18.043167 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:18.043261 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:18.043234 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls podName:e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:18.543213177 +0000 UTC m=+50.341276131 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6ck65" (UID: "e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a") : secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:18.043448 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.043429 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-data-volume\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:18.043584 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.043567 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:18.055799 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.055713 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w5llh\" (UniqueName: \"kubernetes.io/projected/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-kube-api-access-w5llh\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:18.139248 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.138924 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" event={"ID":"553124ae-a77f-4ced-abab-751764ac01e1","Type":"ContainerStarted","Data":"00ec4fa98790b0949db4edef4288c0c13d25ffe4548e092dfcc58f4ba31e9769"}
Apr 24 21:28:18.139248 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.138985 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" event={"ID":"553124ae-a77f-4ced-abab-751764ac01e1","Type":"ContainerStarted","Data":"6f38b36c832afd5d2eb91460387eab538297456868116e91a301ea00c8ddc11a"}
Apr 24 21:28:18.164124 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.164065 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" podStartSLOduration=29.063899548 podStartE2EDuration="44.164046465s" podCreationTimestamp="2026-04-24 21:27:34 +0000 UTC" firstStartedPulling="2026-04-24 21:28:01.994183529 +0000 UTC m=+33.792246484" lastFinishedPulling="2026-04-24 21:28:17.094330439 +0000 UTC m=+48.892393401" observedRunningTime="2026-04-24 21:28:18.162489238 +0000 UTC m=+49.960552241" watchObservedRunningTime="2026-04-24 21:28:18.164046465 +0000 UTC m=+49.962109442"
Apr 24 21:28:18.178855 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.178830 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pzhk2_b026702e-c0cf-4243-981d-4f64cfc8b0a0/node-ca/0.log"
Apr 24 21:28:18.548632 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:18.548585 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:18.548810 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:18.548764 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:18.548863 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:18.548845 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls podName:e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:19.548822911 +0000 UTC m=+51.346885865 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6ck65" (UID: "e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a") : secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:19.557108 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:19.557080 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:19.557444 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:19.557232 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:19.557444 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:19.557309 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls podName:e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:21.557292271 +0000 UTC m=+53.355355228 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6ck65" (UID: "e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a") : secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:20.145523 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:20.145487 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-m4t52" event={"ID":"74148c1b-8ecf-4750-b22d-3bb17904b081","Type":"ContainerStarted","Data":"25e8bcfac85400f16e134923ef6482e734f1c4fe853cb8d89a94faeee39b5e94"}
Apr 24 21:28:21.566487 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:21.566449 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:21.566487 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:21.566491 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:21.567031 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:21.566871 2580 scope.go:117] "RemoveContainer" containerID="12caf291f6178cb042f835b38ccf0be4a9570729901a6644b42c9d1b40043370"
Apr 24 21:28:21.567088 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:21.567060 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-zxbmp_openshift-console-operator(8ae26ae6-db56-4447-825f-208c0ab19d34)\"" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" podUID="8ae26ae6-db56-4447-825f-208c0ab19d34"
Apr 24 21:28:21.578337 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:21.578306 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:21.578492 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:21.578468 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:21.578549 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:21.578540 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls podName:e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:25.578525113 +0000 UTC m=+57.376588060 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6ck65" (UID: "e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a") : secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:25.616292 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:25.616261 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:25.616708 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:25.616411 2580 secret.go:189] Couldn't get secret openshift-insights/insights-runtime-extractor-tls: secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:25.616708 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:25.616495 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls podName:e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a nodeName:}" failed. No retries permitted until 2026-04-24 21:28:33.616479428 +0000 UTC m=+65.414542376 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "insights-runtime-extractor-tls" (UniqueName: "kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls") pod "insights-runtime-extractor-6ck65" (UID: "e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a") : secret "insights-runtime-extractor-tls" not found
Apr 24 21:28:27.847737 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:27.847702 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-bt4zz"
Apr 24 21:28:27.882126 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:27.882076 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-m4t52" podStartSLOduration=41.33974479 podStartE2EDuration="45.88205868s" podCreationTimestamp="2026-04-24 21:27:42 +0000 UTC" firstStartedPulling="2026-04-24 21:28:14.886943219 +0000 UTC m=+46.685006180" lastFinishedPulling="2026-04-24 21:28:19.429257118 +0000 UTC m=+51.227320070" observedRunningTime="2026-04-24 21:28:20.162256451 +0000 UTC m=+51.960319433" watchObservedRunningTime="2026-04-24 21:28:27.88205868 +0000 UTC m=+59.680121650"
Apr 24 21:28:32.685097 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:32.685064 2580 scope.go:117] "RemoveContainer" containerID="12caf291f6178cb042f835b38ccf0be4a9570729901a6644b42c9d1b40043370"
Apr 24 21:28:33.080822 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.080746 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:33.083100 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.083077 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"image-registry-6b4d778455-w6fvh\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:33.181170 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.181146 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz"
Apr 24 21:28:33.181305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.181181 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:33.181305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.181292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:33.181444 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.181422 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"
Apr 24 21:28:33.181650 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.181505 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 21:28:33.181768 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.181738 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" event={"ID":"8ae26ae6-db56-4447-825f-208c0ab19d34","Type":"ContainerStarted","Data":"6863ba6de1705e52d8ef4644ce82e4e7980b80c4f4083b5d1551fba6e2db23f9"}
Apr 24 21:28:33.181831 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.181781 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a318551b-2eb1-436a-8979-f4e740c2e662-service-ca-bundle\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:33.182115 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.182098 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:33.184109 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.184091 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/0df49e02-8899-47f3-804e-9975c913c649-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-pxvnz\" (UID: \"0df49e02-8899-47f3-804e-9975c913c649\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz"
Apr 24 21:28:33.184194 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.184091 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/a318551b-2eb1-436a-8979-f4e740c2e662-metrics-certs\") pod \"router-default-7f4c79d4bd-dxrjx\" (UID: \"a318551b-2eb1-436a-8979-f4e740c2e662\") " pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:33.184194 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.184182 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/6cd08957-865f-4442-98d3-f5cd050c3fb6-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-542j7\" (UID: \"6cd08957-865f-4442-98d3-f5cd050c3fb6\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"
Apr 24 21:28:33.202753 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.202706 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp" podStartSLOduration=25.953267803 podStartE2EDuration="38.20269158s" podCreationTimestamp="2026-04-24 21:27:55 +0000 UTC" firstStartedPulling="2026-04-24 21:28:02.002957039 +0000 UTC m=+33.801019988" lastFinishedPulling="2026-04-24 21:28:14.252380804 +0000 UTC m=+46.050443765" observedRunningTime="2026-04-24 21:28:33.201590896 +0000 UTC m=+64.999653865" watchObservedRunningTime="2026-04-24 21:28:33.20269158 +0000 UTC m=+65.000754551"
Apr 24 21:28:33.279197 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.279176 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-6crsc\""
Apr 24 21:28:33.287415 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.287396 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:33.291270 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.291249 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-bh9nr\""
Apr 24 21:28:33.299380 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.299358 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"
Apr 24 21:28:33.308561 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.308540 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-95fs2\""
Apr 24 21:28:33.316889 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.316869 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz"
Apr 24 21:28:33.330367 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.330337 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-hhxk7\""
Apr 24 21:28:33.338757 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.338598 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:33.384678 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.383368 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht"
Apr 24 21:28:33.384678 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.383482 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:33.388412 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.388315 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/78c8f2d7-06fd-41fd-91a1-02af0d79bea4-metrics-tls\") pod \"dns-default-fp6xs\" (UID: \"78c8f2d7-06fd-41fd-91a1-02af0d79bea4\") " pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:33.389221 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.388884 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804-cert\") pod \"ingress-canary-5gnht\" (UID: \"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804\") " pod="openshift-ingress-canary/ingress-canary-5gnht"
Apr 24 21:28:33.395627 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.395233 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-zxbmp"
Apr 24 21:28:33.500131 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.500101 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-ssv7d\""
Apr 24 21:28:33.507259 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.507236 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-5gnht"
Apr 24 21:28:33.516608 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.516591 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-xsnkf\""
Apr 24 21:28:33.538944 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.527579 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:33.538944 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.528518 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-6b4d778455-w6fvh"]
Apr 24 21:28:33.538944 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.531910 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7"]
Apr 24 21:28:33.538944 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.534539 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz"]
Apr 24 21:28:33.541848 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.541819 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7f4c79d4bd-dxrjx"]
Apr 24 21:28:33.550073 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:33.550042 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6cd08957_865f_4442_98d3_f5cd050c3fb6.slice/crio-c44cff5f5e323abecfdeb01af9acff9cbe2b1a6ccb5f85cda84519e2ca620948 WatchSource:0}: Error finding container c44cff5f5e323abecfdeb01af9acff9cbe2b1a6ccb5f85cda84519e2ca620948: Status 404 returned error can't find the container with id c44cff5f5e323abecfdeb01af9acff9cbe2b1a6ccb5f85cda84519e2ca620948
Apr 24 21:28:33.672039 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.671990 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-5gnht"]
Apr 24 21:28:33.674511 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:33.674485 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod545ca8a5_6c3a_4c0f_bd52_f71dfa7a1804.slice/crio-f48bdadf3d9c8d0daeb2ca39bed2f93a0a6c76db0155685b2f3ccd19124bc9b2 WatchSource:0}: Error finding container f48bdadf3d9c8d0daeb2ca39bed2f93a0a6c76db0155685b2f3ccd19124bc9b2: Status 404 returned error can't find the container with id f48bdadf3d9c8d0daeb2ca39bed2f93a0a6c76db0155685b2f3ccd19124bc9b2
Apr 24 21:28:33.686845 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.686822 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:33.689775 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.689238 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-6ck65\" (UID: \"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a\") " pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:33.689980 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.689952 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-fp6xs"]
Apr 24 21:28:33.694069 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:33.694048 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78c8f2d7_06fd_41fd_91a1_02af0d79bea4.slice/crio-911792d16eb37eaa5a1132e8a52795b8fd5b39e74b306c23db4d63468b6042ad WatchSource:0}: Error finding container 911792d16eb37eaa5a1132e8a52795b8fd5b39e74b306c23db4d63468b6042ad: Status 404 returned error can't find the container with id 911792d16eb37eaa5a1132e8a52795b8fd5b39e74b306c23db4d63468b6042ad
Apr 24 21:28:33.716909 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.716885 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qk948\""
Apr 24 21:28:33.725120 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.725101 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-6ck65"
Apr 24 21:28:33.862471 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:33.862378 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-6ck65"]
Apr 24 21:28:33.866712 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:33.866685 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode442ca2c_20fc_4efd_a8db_bf2f9d0ae75a.slice/crio-f0732dfc07aead6b481f39f03cb1870437920da8ac235ecbee77d8dac490a9bd WatchSource:0}: Error finding container f0732dfc07aead6b481f39f03cb1870437920da8ac235ecbee77d8dac490a9bd: Status 404 returned error can't find the container with id f0732dfc07aead6b481f39f03cb1870437920da8ac235ecbee77d8dac490a9bd
Apr 24 21:28:34.187472 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.187389 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6ck65" event={"ID":"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a","Type":"ContainerStarted","Data":"83874f36ae99fa57d86f0310ac29374511a12e128bc24bdf289ce51c56adf9b2"}
Apr 24 21:28:34.187472 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.187433 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6ck65" event={"ID":"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a","Type":"ContainerStarted","Data":"f0732dfc07aead6b481f39f03cb1870437920da8ac235ecbee77d8dac490a9bd"}
Apr 24 21:28:34.189459 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.189427 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" event={"ID":"a318551b-2eb1-436a-8979-f4e740c2e662","Type":"ContainerStarted","Data":"71a4530c5ed324758f83bee94707754d5fbb8e27c6d39d88d583e52a64bff39a"}
Apr 24 21:28:34.189580 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.189465 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" event={"ID":"a318551b-2eb1-436a-8979-f4e740c2e662","Type":"ContainerStarted","Data":"b9607b2d09db2e2c414b3bb84419b3014bbd933dd5a99ecb0dbe4643b8a408b0"}
Apr 24 21:28:34.190812 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.190773 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5gnht" event={"ID":"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804","Type":"ContainerStarted","Data":"f48bdadf3d9c8d0daeb2ca39bed2f93a0a6c76db0155685b2f3ccd19124bc9b2"}
Apr 24 21:28:34.192514 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.192335 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" event={"ID":"4b822c28-ad3a-4754-8c9b-aeaf91af3b98","Type":"ContainerStarted","Data":"70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409"}
Apr 24 21:28:34.192514 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.192368 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" event={"ID":"4b822c28-ad3a-4754-8c9b-aeaf91af3b98","Type":"ContainerStarted","Data":"c7585613044210e030122d104f62029b2c258781dea0c3f863e53f39629b425f"}
Apr 24 21:28:34.192846 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.192715 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh"
Apr 24 21:28:34.193783 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.193749 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fp6xs" event={"ID":"78c8f2d7-06fd-41fd-91a1-02af0d79bea4","Type":"ContainerStarted","Data":"911792d16eb37eaa5a1132e8a52795b8fd5b39e74b306c23db4d63468b6042ad"}
Apr 24 21:28:34.195579 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.195554 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" event={"ID":"0df49e02-8899-47f3-804e-9975c913c649","Type":"ContainerStarted","Data":"cbef7f0b7d43e8a22d923237fc6443a8e9b7d01f066b18f68b9793a933ac503d"}
Apr 24 21:28:34.197198 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.197148 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" event={"ID":"6cd08957-865f-4442-98d3-f5cd050c3fb6","Type":"ContainerStarted","Data":"c44cff5f5e323abecfdeb01af9acff9cbe2b1a6ccb5f85cda84519e2ca620948"}
Apr 24 21:28:34.214135 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.213220 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx" podStartSLOduration=48.213205404 podStartE2EDuration="48.213205404s" podCreationTimestamp="2026-04-24 21:27:46 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:34.21264813 +0000 UTC m=+66.010711113" watchObservedRunningTime="2026-04-24 21:28:34.213205404 +0000 UTC m=+66.011268368"
Apr 24 21:28:34.233417 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.233370 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" podStartSLOduration=65.233353383 podStartE2EDuration="1m5.233353383s" podCreationTimestamp="2026-04-24 21:27:29 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:28:34.232438593 +0000 UTC m=+66.030501566" watchObservedRunningTime="2026-04-24 21:28:34.233353383 +0000 UTC m=+66.031416354"
Apr 24 21:28:34.339355 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.339283 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:34.342850 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.342809 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:34.395089 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.393541 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:28:34.396221 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.396196 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 24 21:28:34.407476 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.407431 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/dff89703-eb5c-40dd-b22c-a598308414bc-metrics-certs\") pod \"network-metrics-daemon-c8k6b\" (UID: \"dff89703-eb5c-40dd-b22c-a598308414bc\") " pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:28:34.532613 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.532529 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-7528m\""
Apr 24 21:28:34.540914 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.540499 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-c8k6b"
Apr 24 21:28:34.731626 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:34.731072 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-c8k6b"]
Apr 24 21:28:34.739729 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:34.739687 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddff89703_eb5c_40dd_b22c_a598308414bc.slice/crio-ead65e05f24a40048e755207421b9473ff561d42d5692bf0c0a48b6513b42209 WatchSource:0}: Error finding container ead65e05f24a40048e755207421b9473ff561d42d5692bf0c0a48b6513b42209: Status 404 returned error can't find the container with id ead65e05f24a40048e755207421b9473ff561d42d5692bf0c0a48b6513b42209
Apr 24 21:28:35.202971 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:35.202921 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c8k6b" event={"ID":"dff89703-eb5c-40dd-b22c-a598308414bc","Type":"ContainerStarted","Data":"ead65e05f24a40048e755207421b9473ff561d42d5692bf0c0a48b6513b42209"}
Apr 24 21:28:35.203676 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:35.203335 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:35.204789 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:35.204769 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7f4c79d4bd-dxrjx"
Apr 24 21:28:39.218759 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.218714 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fp6xs" event={"ID":"78c8f2d7-06fd-41fd-91a1-02af0d79bea4","Type":"ContainerStarted","Data":"2165c94cd586a018dbc019760a876f5270ac8e605dcd19553be258c8f67cd06f"}
Apr 24 21:28:39.218759 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.218762 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-fp6xs" event={"ID":"78c8f2d7-06fd-41fd-91a1-02af0d79bea4","Type":"ContainerStarted","Data":"f902658bba4ad42657d1b652b8614fd1aee06c710dd2d27085dc020452cde83f"}
Apr 24 21:28:39.219370 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.218911 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-fp6xs"
Apr 24 21:28:39.225445 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.225413 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" event={"ID":"0df49e02-8899-47f3-804e-9975c913c649","Type":"ContainerStarted","Data":"be51fa76d77d77ff64f6de7d315fb934fe5bd623f0c6ec93b324565e6458a56e"}
Apr 24 21:28:39.225601 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.225453 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" event={"ID":"0df49e02-8899-47f3-804e-9975c913c649","Type":"ContainerStarted","Data":"7a4dc5125f92fff1daf79913fa10b5ac91697466dc7302387f1d6c05ffb35cf6"}
Apr 24 21:28:39.227336 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.227294 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" event={"ID":"6cd08957-865f-4442-98d3-f5cd050c3fb6","Type":"ContainerStarted","Data":"6c0622356278fa5ff90d10c467b8b8ac0f407536e4b491e8fb73c6b1963c20d6"}
Apr 24 21:28:39.229069 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.229037 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c8k6b" event={"ID":"dff89703-eb5c-40dd-b22c-a598308414bc","Type":"ContainerStarted","Data":"119e7d4cee1f76a89a080762fab7b7493de12bdcb44b0cdb354ab4f11ad376e2"}
Apr 24 21:28:39.229200 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.229073 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-c8k6b" event={"ID":"dff89703-eb5c-40dd-b22c-a598308414bc","Type":"ContainerStarted","Data":"75c29622070cadc126296e4a047949a970310cf189d82e0f66eb338b6c4ec5f3"}
Apr 24 21:28:39.231131 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.231104 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6ck65" event={"ID":"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a","Type":"ContainerStarted","Data":"b22100cd735ae5b73da4e04463463e17ea8a788def7db41bac8a9efd58301365"}
Apr 24 21:28:39.232546 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.232517 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-5gnht" event={"ID":"545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804","Type":"ContainerStarted","Data":"c9b47bbb7250045e3edd81a390e36c52b246c7878a6a49d9f83e0eda88141eae"}
Apr 24 21:28:39.240324 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.240258 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-fp6xs" podStartSLOduration=33.672501781 podStartE2EDuration="38.240225256s" podCreationTimestamp="2026-04-24 21:28:01 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.695745233 +0000 UTC m=+65.493808185" lastFinishedPulling="2026-04-24 21:28:38.263468697 +0000 UTC m=+70.061531660"
observedRunningTime="2026-04-24 21:28:39.238529353 +0000 UTC m=+71.036592323" watchObservedRunningTime="2026-04-24 21:28:39.240225256 +0000 UTC m=+71.038288226" Apr 24 21:28:39.270167 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.269989 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-c8k6b" podStartSLOduration=67.695391316 podStartE2EDuration="1m11.269966849s" podCreationTimestamp="2026-04-24 21:27:28 +0000 UTC" firstStartedPulling="2026-04-24 21:28:34.743323586 +0000 UTC m=+66.541386541" lastFinishedPulling="2026-04-24 21:28:38.317899104 +0000 UTC m=+70.115962074" observedRunningTime="2026-04-24 21:28:39.269128492 +0000 UTC m=+71.067191464" watchObservedRunningTime="2026-04-24 21:28:39.269966849 +0000 UTC m=+71.068029820" Apr 24 21:28:39.301810 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.301743 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-542j7" podStartSLOduration=48.593288156 podStartE2EDuration="53.301722556s" podCreationTimestamp="2026-04-24 21:27:46 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.555074294 +0000 UTC m=+65.353137249" lastFinishedPulling="2026-04-24 21:28:38.263508695 +0000 UTC m=+70.061571649" observedRunningTime="2026-04-24 21:28:39.30128664 +0000 UTC m=+71.099349611" watchObservedRunningTime="2026-04-24 21:28:39.301722556 +0000 UTC m=+71.099785523" Apr 24 21:28:39.327202 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.327147 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-pxvnz" podStartSLOduration=40.661406232 podStartE2EDuration="45.327127821s" podCreationTimestamp="2026-04-24 21:27:54 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.597573711 +0000 UTC m=+65.395636659" lastFinishedPulling="2026-04-24 21:28:38.263295301 +0000 UTC m=+70.061358248" 
observedRunningTime="2026-04-24 21:28:39.324777397 +0000 UTC m=+71.122840366" watchObservedRunningTime="2026-04-24 21:28:39.327127821 +0000 UTC m=+71.125190788" Apr 24 21:28:39.345319 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:39.345248 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-5gnht" podStartSLOduration=33.757928263 podStartE2EDuration="38.345226356s" podCreationTimestamp="2026-04-24 21:28:01 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.676368374 +0000 UTC m=+65.474431322" lastFinishedPulling="2026-04-24 21:28:38.263666465 +0000 UTC m=+70.061729415" observedRunningTime="2026-04-24 21:28:39.34284722 +0000 UTC m=+71.140910193" watchObservedRunningTime="2026-04-24 21:28:39.345226356 +0000 UTC m=+71.143289327" Apr 24 21:28:40.236854 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:40.236812 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-6ck65" event={"ID":"e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a","Type":"ContainerStarted","Data":"433077ce9092ca689b65bc9c1bbafc106085ccb87eec844a52a8e7368991841c"} Apr 24 21:28:40.277850 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:40.277731 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-6ck65" podStartSLOduration=17.25243526 podStartE2EDuration="23.277711194s" podCreationTimestamp="2026-04-24 21:28:17 +0000 UTC" firstStartedPulling="2026-04-24 21:28:33.978535878 +0000 UTC m=+65.776598840" lastFinishedPulling="2026-04-24 21:28:40.003811826 +0000 UTC m=+71.801874774" observedRunningTime="2026-04-24 21:28:40.276600757 +0000 UTC m=+72.074663727" watchObservedRunningTime="2026-04-24 21:28:40.277711194 +0000 UTC m=+72.075774165" Apr 24 21:28:46.131926 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:46.131805 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-network-diagnostics/network-check-target-7k8nd" Apr 24 21:28:47.518362 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.518325 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-c5fkj"] Apr 24 21:28:47.521894 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.521870 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.524707 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.524683 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 24 21:28:47.524937 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.524916 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-sgkz5\"" Apr 24 21:28:47.525187 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.525166 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 24 21:28:47.525864 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.525842 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 24 21:28:47.525942 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.525891 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 24 21:28:47.599667 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599634 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-wtmp\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.599667 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599671 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d653d51-55b5-48dc-8ba4-a85466d08ff4-metrics-client-ca\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.599890 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599747 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-root\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.599890 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599794 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-textfile\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.599890 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599824 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-sys\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.599890 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599866 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c5fkj\" 
(UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.599890 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599886 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-tls\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.600082 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599901 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8x78g\" (UniqueName: \"kubernetes.io/projected/5d653d51-55b5-48dc-8ba4-a85466d08ff4-kube-api-access-8x78g\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.600082 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.599917 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-accelerators-collector-config\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700516 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700476 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-root\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700698 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700532 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" 
(UniqueName: \"kubernetes.io/empty-dir/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-textfile\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700698 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-sys\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700698 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700601 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-root\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700698 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700670 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-sys\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700698 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700611 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700901 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700735 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-tls\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700901 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700760 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8x78g\" (UniqueName: \"kubernetes.io/projected/5d653d51-55b5-48dc-8ba4-a85466d08ff4-kube-api-access-8x78g\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700901 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700790 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-accelerators-collector-config\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700901 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700825 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-wtmp\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.700901 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700851 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d653d51-55b5-48dc-8ba4-a85466d08ff4-metrics-client-ca\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.701147 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:47.700905 2580 secret.go:189] 
Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 24 21:28:47.701147 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.700937 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-textfile\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.701147 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:28:47.700987 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-tls podName:5d653d51-55b5-48dc-8ba4-a85466d08ff4 nodeName:}" failed. No retries permitted until 2026-04-24 21:28:48.200965832 +0000 UTC m=+79.999028803 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-tls") pod "node-exporter-c5fkj" (UID: "5d653d51-55b5-48dc-8ba4-a85466d08ff4") : secret "node-exporter-tls" not found Apr 24 21:28:47.701147 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.701119 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-wtmp\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.701469 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.701447 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5d653d51-55b5-48dc-8ba4-a85466d08ff4-metrics-client-ca\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.701641 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.701619 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-accelerators-collector-config\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.703327 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.703310 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:47.713352 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:47.713326 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8x78g\" (UniqueName: \"kubernetes.io/projected/5d653d51-55b5-48dc-8ba4-a85466d08ff4-kube-api-access-8x78g\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:48.204978 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:48.204944 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-tls\") pod \"node-exporter-c5fkj\" (UID: \"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:48.207522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:48.207488 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/5d653d51-55b5-48dc-8ba4-a85466d08ff4-node-exporter-tls\") pod \"node-exporter-c5fkj\" (UID: 
\"5d653d51-55b5-48dc-8ba4-a85466d08ff4\") " pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:48.431128 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:48.431091 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-c5fkj" Apr 24 21:28:48.440445 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:28:48.440406 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5d653d51_55b5_48dc_8ba4_a85466d08ff4.slice/crio-d3188d524492763a863dfe31da8d5bef11753ec8edb90476301dfe0586b7a92f WatchSource:0}: Error finding container d3188d524492763a863dfe31da8d5bef11753ec8edb90476301dfe0586b7a92f: Status 404 returned error can't find the container with id d3188d524492763a863dfe31da8d5bef11753ec8edb90476301dfe0586b7a92f Apr 24 21:28:49.239908 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:49.239870 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-fp6xs" Apr 24 21:28:49.266067 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:49.266033 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c5fkj" event={"ID":"5d653d51-55b5-48dc-8ba4-a85466d08ff4","Type":"ContainerStarted","Data":"d3188d524492763a863dfe31da8d5bef11753ec8edb90476301dfe0586b7a92f"} Apr 24 21:28:50.270548 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:50.270457 2580 generic.go:358] "Generic (PLEG): container finished" podID="5d653d51-55b5-48dc-8ba4-a85466d08ff4" containerID="9f70c2e888f9de876aa2e406d846b3adb72992a8f998dea392e177f2149b65b3" exitCode=0 Apr 24 21:28:50.270548 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:50.270503 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c5fkj" event={"ID":"5d653d51-55b5-48dc-8ba4-a85466d08ff4","Type":"ContainerDied","Data":"9f70c2e888f9de876aa2e406d846b3adb72992a8f998dea392e177f2149b65b3"} Apr 24 
21:28:51.275669 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:51.275630 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c5fkj" event={"ID":"5d653d51-55b5-48dc-8ba4-a85466d08ff4","Type":"ContainerStarted","Data":"05cabe5c3ce8dd7713b36c24189bba154178fd6e0aeae7a295fda7d6460c37c8"} Apr 24 21:28:51.275669 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:51.275669 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-c5fkj" event={"ID":"5d653d51-55b5-48dc-8ba4-a85466d08ff4","Type":"ContainerStarted","Data":"9754a33f6d2fe97e549c6a61906a1b6d6bd01d7fb6b10a1f856fade8183bab8b"} Apr 24 21:28:51.298119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:51.298067 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-c5fkj" podStartSLOduration=3.047706956 podStartE2EDuration="4.298052009s" podCreationTimestamp="2026-04-24 21:28:47 +0000 UTC" firstStartedPulling="2026-04-24 21:28:48.442204434 +0000 UTC m=+80.240267397" lastFinishedPulling="2026-04-24 21:28:49.692549501 +0000 UTC m=+81.490612450" observedRunningTime="2026-04-24 21:28:51.296178706 +0000 UTC m=+83.094241676" watchObservedRunningTime="2026-04-24 21:28:51.298052009 +0000 UTC m=+83.096114981" Apr 24 21:28:53.291914 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:53.291876 2580 patch_prober.go:28] interesting pod/image-registry-6b4d778455-w6fvh container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 24 21:28:53.292311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:53.291930 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" podUID="4b822c28-ad3a-4754-8c9b-aeaf91af3b98" containerName="registry" 
probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 24 21:28:55.207402 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:55.207373 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:28:59.737453 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:28:59.737414 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b4d778455-w6fvh"] Apr 24 21:29:24.760390 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:24.760288 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" podUID="4b822c28-ad3a-4754-8c9b-aeaf91af3b98" containerName="registry" containerID="cri-o://70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409" gracePeriod=30 Apr 24 21:29:25.006059 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.006031 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:29:25.128794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.128759 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-trusted-ca\") pod \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " Apr 24 21:29:25.128957 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.128808 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-installation-pull-secrets\") pod \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " Apr 24 21:29:25.128957 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.128838 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-ca-trust-extracted\") pod \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " Apr 24 21:29:25.128957 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.128871 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-bound-sa-token\") pod \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " Apr 24 21:29:25.128957 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.128904 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") pod \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " Apr 24 21:29:25.128957 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.128925 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-image-registry-private-configuration\") pod \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " Apr 24 21:29:25.128957 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.128951 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gdxqd\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-kube-api-access-gdxqd\") pod \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " Apr 24 21:29:25.129305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.129028 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-certificates\") pod \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\" (UID: \"4b822c28-ad3a-4754-8c9b-aeaf91af3b98\") " Apr 24 21:29:25.129362 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.129315 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4b822c28-ad3a-4754-8c9b-aeaf91af3b98" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:25.129610 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.129582 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4b822c28-ad3a-4754-8c9b-aeaf91af3b98" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98"). 
InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 24 21:29:25.131568 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.131524 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "4b822c28-ad3a-4754-8c9b-aeaf91af3b98" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:25.131728 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.131701 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4b822c28-ad3a-4754-8c9b-aeaf91af3b98" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:25.131799 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.131782 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4b822c28-ad3a-4754-8c9b-aeaf91af3b98" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:29:25.131853 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.131826 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4b822c28-ad3a-4754-8c9b-aeaf91af3b98" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98"). InnerVolumeSpecName "bound-sa-token". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:25.131939 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.131909 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-kube-api-access-gdxqd" (OuterVolumeSpecName: "kube-api-access-gdxqd") pod "4b822c28-ad3a-4754-8c9b-aeaf91af3b98" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98"). InnerVolumeSpecName "kube-api-access-gdxqd". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:29:25.137907 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.137874 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4b822c28-ad3a-4754-8c9b-aeaf91af3b98" (UID: "4b822c28-ad3a-4754-8c9b-aeaf91af3b98"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:29:25.229851 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.229808 2580 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-bound-sa-token\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:29:25.229851 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.229842 2580 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-tls\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:29:25.229851 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.229853 2580 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-image-registry-private-configuration\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:29:25.229851 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.229864 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gdxqd\" (UniqueName: \"kubernetes.io/projected/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-kube-api-access-gdxqd\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:29:25.230155 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.229874 2580 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-registry-certificates\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:29:25.230155 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.229883 2580 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-trusted-ca\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:29:25.230155 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.229892 2580 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-installation-pull-secrets\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:29:25.230155 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.229900 2580 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4b822c28-ad3a-4754-8c9b-aeaf91af3b98-ca-trust-extracted\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:29:25.373006 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.372951 2580 generic.go:358] "Generic (PLEG): container finished" podID="4b822c28-ad3a-4754-8c9b-aeaf91af3b98" containerID="70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409" exitCode=0 Apr 24 21:29:25.373170 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.373044 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" Apr 24 21:29:25.373170 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.373036 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" event={"ID":"4b822c28-ad3a-4754-8c9b-aeaf91af3b98","Type":"ContainerDied","Data":"70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409"} Apr 24 21:29:25.373170 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.373143 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-6b4d778455-w6fvh" event={"ID":"4b822c28-ad3a-4754-8c9b-aeaf91af3b98","Type":"ContainerDied","Data":"c7585613044210e030122d104f62029b2c258781dea0c3f863e53f39629b425f"} Apr 24 21:29:25.373170 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.373158 2580 scope.go:117] "RemoveContainer" containerID="70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409" Apr 24 21:29:25.381971 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.381951 2580 scope.go:117] "RemoveContainer" containerID="70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409" Apr 24 21:29:25.382316 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:29:25.382294 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409\": container with ID starting with 70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409 not found: ID does not exist" containerID="70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409" Apr 24 21:29:25.382368 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.382329 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409"} err="failed to get container status 
\"70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409\": rpc error: code = NotFound desc = could not find container \"70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409\": container with ID starting with 70e730b77f048daeb097262b475f58d06f55e8c590a8ef9d5736a1b038341409 not found: ID does not exist" Apr 24 21:29:25.397269 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.397234 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-6b4d778455-w6fvh"] Apr 24 21:29:25.403252 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:25.403218 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-6b4d778455-w6fvh"] Apr 24 21:29:26.688860 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:26.688825 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b822c28-ad3a-4754-8c9b-aeaf91af3b98" path="/var/lib/kubelet/pods/4b822c28-ad3a-4754-8c9b-aeaf91af3b98/volumes" Apr 24 21:29:31.394885 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:31.394850 2580 generic.go:358] "Generic (PLEG): container finished" podID="a4d3d8ce-54d9-4c28-8043-df84ae070d16" containerID="ea9d4164042ac1c99aa84aa45fcfb52fb201d85477cb1620d78f37e033aaef1a" exitCode=0 Apr 24 21:29:31.395287 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:31.394902 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" event={"ID":"a4d3d8ce-54d9-4c28-8043-df84ae070d16","Type":"ContainerDied","Data":"ea9d4164042ac1c99aa84aa45fcfb52fb201d85477cb1620d78f37e033aaef1a"} Apr 24 21:29:31.395287 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:31.395200 2580 scope.go:117] "RemoveContainer" containerID="ea9d4164042ac1c99aa84aa45fcfb52fb201d85477cb1620d78f37e033aaef1a" Apr 24 21:29:32.399151 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:32.399108 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-74bgx" event={"ID":"a4d3d8ce-54d9-4c28-8043-df84ae070d16","Type":"ContainerStarted","Data":"7223a173db973d1a00ff6b00df81f54aad7955a2b2220f72cf499fa0a1e0afc8"} Apr 24 21:29:41.642274 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:41.642234 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" podUID="553124ae-a77f-4ced-abab-751764ac01e1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:29:45.438295 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:45.438259 2580 generic.go:358] "Generic (PLEG): container finished" podID="fc9822dc-9a3d-4fdf-94b1-053fb0f0608b" containerID="25bb5994c296204642b60fedbd1b803fa3fcf3fbc06ce5df4f5e92f7334c324e" exitCode=0 Apr 24 21:29:45.438683 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:45.438319 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w6q7n" event={"ID":"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b","Type":"ContainerDied","Data":"25bb5994c296204642b60fedbd1b803fa3fcf3fbc06ce5df4f5e92f7334c324e"} Apr 24 21:29:45.438683 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:45.438623 2580 scope.go:117] "RemoveContainer" containerID="25bb5994c296204642b60fedbd1b803fa3fcf3fbc06ce5df4f5e92f7334c324e" Apr 24 21:29:46.442815 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:46.442781 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-w6q7n" event={"ID":"fc9822dc-9a3d-4fdf-94b1-053fb0f0608b","Type":"ContainerStarted","Data":"70cf604ceb637a598a245d8e23b82527462a4a5973b1e36b4be651d0e22a29a1"} Apr 24 21:29:50.456071 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:50.456035 2580 generic.go:358] "Generic (PLEG): container finished" podID="d1681d40-7ce7-4810-b3ea-1c27861ac3d8" 
containerID="2857887464cf23ed13116a2c320cd195bcefdaf54107cbdb107c89bd400f0d56" exitCode=0 Apr 24 21:29:50.456538 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:50.456114 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" event={"ID":"d1681d40-7ce7-4810-b3ea-1c27861ac3d8","Type":"ContainerDied","Data":"2857887464cf23ed13116a2c320cd195bcefdaf54107cbdb107c89bd400f0d56"} Apr 24 21:29:50.456614 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:50.456551 2580 scope.go:117] "RemoveContainer" containerID="2857887464cf23ed13116a2c320cd195bcefdaf54107cbdb107c89bd400f0d56" Apr 24 21:29:51.460510 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:51.460474 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-4kn9f" event={"ID":"d1681d40-7ce7-4810-b3ea-1c27861ac3d8","Type":"ContainerStarted","Data":"1650f892a1ae10f0c398095555017310bc30cc4b3e1081819d29ceed2c1a5cd3"} Apr 24 21:29:51.642218 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:29:51.642177 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" podUID="553124ae-a77f-4ced-abab-751764ac01e1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:30:01.643944 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:30:01.643892 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" podUID="553124ae-a77f-4ced-abab-751764ac01e1" containerName="service-proxy" probeResult="failure" output="HTTP probe failed with statuscode: 500" Apr 24 21:30:01.644464 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:30:01.643973 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" Apr 24 
21:30:01.644464 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:30:01.644456 2580 kuberuntime_manager.go:1107] "Message for Container of pod" containerName="service-proxy" containerStatusID={"Type":"cri-o","ID":"00ec4fa98790b0949db4edef4288c0c13d25ffe4548e092dfcc58f4ba31e9769"} pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" containerMessage="Container service-proxy failed liveness probe, will be restarted" Apr 24 21:30:01.644551 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:30:01.644492 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" podUID="553124ae-a77f-4ced-abab-751764ac01e1" containerName="service-proxy" containerID="cri-o://00ec4fa98790b0949db4edef4288c0c13d25ffe4548e092dfcc58f4ba31e9769" gracePeriod=30 Apr 24 21:30:02.496643 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:30:02.496607 2580 generic.go:358] "Generic (PLEG): container finished" podID="553124ae-a77f-4ced-abab-751764ac01e1" containerID="00ec4fa98790b0949db4edef4288c0c13d25ffe4548e092dfcc58f4ba31e9769" exitCode=2 Apr 24 21:30:02.496643 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:30:02.496654 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" event={"ID":"553124ae-a77f-4ced-abab-751764ac01e1","Type":"ContainerDied","Data":"00ec4fa98790b0949db4edef4288c0c13d25ffe4548e092dfcc58f4ba31e9769"} Apr 24 21:30:02.496855 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:30:02.496679 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/cluster-proxy-proxy-agent-66f65d7c76-zsch6" event={"ID":"553124ae-a77f-4ced-abab-751764ac01e1","Type":"ContainerStarted","Data":"716b4a9549e0d6744f7e15e3be184f490a1d2d3f4765d32bfd08fc80991acc83"} Apr 24 21:32:28.604684 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:32:28.604652 2580 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 21:32:28.605212 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:32:28.605116 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 21:32:28.608912 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:32:28.608890 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:32:28.609591 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:32:28.609572 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:32:28.615121 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:32:28.615103 2580 kubelet.go:1628] "Image garbage collection succeeded" Apr 24 21:34:49.529559 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.529455 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h"] Apr 24 21:34:49.530112 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.529887 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b822c28-ad3a-4754-8c9b-aeaf91af3b98" containerName="registry" Apr 24 21:34:49.530112 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.529903 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b822c28-ad3a-4754-8c9b-aeaf91af3b98" containerName="registry" Apr 24 21:34:49.530112 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.529988 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b822c28-ad3a-4754-8c9b-aeaf91af3b98" containerName="registry" Apr 24 21:34:49.533116 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.533100 2580 util.go:30] 
"No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.537178 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.537024 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 24 21:34:49.537178 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.537073 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 24 21:34:49.537178 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.537088 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 24 21:34:49.537178 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.537108 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 24 21:34:49.537178 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.537115 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-b9ct4\"" Apr 24 21:34:49.537178 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.537144 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 24 21:34:49.560531 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.560503 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h"] Apr 24 21:34:49.640150 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.640119 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/edfa84d5-b4ca-4238-b102-34b23c953972-manager-config\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: 
\"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.640323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.640186 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edfa84d5-b4ca-4238-b102-34b23c953972-cert\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.640323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.640205 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/edfa84d5-b4ca-4238-b102-34b23c953972-metrics-cert\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.640323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.640234 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7gcv7\" (UniqueName: \"kubernetes.io/projected/edfa84d5-b4ca-4238-b102-34b23c953972-kube-api-access-7gcv7\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.741323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.741292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edfa84d5-b4ca-4238-b102-34b23c953972-cert\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.741323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.741323 2580 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/edfa84d5-b4ca-4238-b102-34b23c953972-metrics-cert\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.741530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.741358 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7gcv7\" (UniqueName: \"kubernetes.io/projected/edfa84d5-b4ca-4238-b102-34b23c953972-kube-api-access-7gcv7\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.741530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.741395 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/edfa84d5-b4ca-4238-b102-34b23c953972-manager-config\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.742141 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.742123 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/edfa84d5-b4ca-4238-b102-34b23c953972-manager-config\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.743880 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.743857 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/edfa84d5-b4ca-4238-b102-34b23c953972-metrics-cert\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: 
\"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.743973 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.743887 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/edfa84d5-b4ca-4238-b102-34b23c953972-cert\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.750571 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.750553 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7gcv7\" (UniqueName: \"kubernetes.io/projected/edfa84d5-b4ca-4238-b102-34b23c953972-kube-api-access-7gcv7\") pod \"lws-controller-manager-5774f66dc9-c6c4h\" (UID: \"edfa84d5-b4ca-4238-b102-34b23c953972\") " pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:49.842357 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:49.842257 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:34:50.001168 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:50.001132 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h"] Apr 24 21:34:50.004349 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:34:50.004321 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedfa84d5_b4ca_4238_b102_34b23c953972.slice/crio-9c667fb8ef148329dade07b3fd0b1e976d6652067d19a1838c6325197de32ac9 WatchSource:0}: Error finding container 9c667fb8ef148329dade07b3fd0b1e976d6652067d19a1838c6325197de32ac9: Status 404 returned error can't find the container with id 9c667fb8ef148329dade07b3fd0b1e976d6652067d19a1838c6325197de32ac9 Apr 24 21:34:50.006079 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:50.006061 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:34:50.316051 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:50.316011 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" event={"ID":"edfa84d5-b4ca-4238-b102-34b23c953972","Type":"ContainerStarted","Data":"9c667fb8ef148329dade07b3fd0b1e976d6652067d19a1838c6325197de32ac9"} Apr 24 21:34:52.323154 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:52.323111 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" event={"ID":"edfa84d5-b4ca-4238-b102-34b23c953972","Type":"ContainerStarted","Data":"347d1b3771d753a242fec0b5a60f46e8c98850a4aad27e6228c6a7b8a72ad7f4"} Apr 24 21:34:52.323551 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:52.323247 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 
21:34:52.357620 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:34:52.357547 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" podStartSLOduration=1.119474467 podStartE2EDuration="3.357525513s" podCreationTimestamp="2026-04-24 21:34:49 +0000 UTC" firstStartedPulling="2026-04-24 21:34:50.006180036 +0000 UTC m=+441.804242983" lastFinishedPulling="2026-04-24 21:34:52.244231081 +0000 UTC m=+444.042294029" observedRunningTime="2026-04-24 21:34:52.355458138 +0000 UTC m=+444.153521108" watchObservedRunningTime="2026-04-24 21:34:52.357525513 +0000 UTC m=+444.155588484" Apr 24 21:35:03.328886 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:35:03.328853 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-5774f66dc9-c6c4h" Apr 24 21:36:20.647285 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.647190 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d"] Apr 24 21:36:20.650331 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.650313 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:20.652932 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.652909 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 24 21:36:20.653104 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.652912 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 24 21:36:20.653250 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.653236 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 24 21:36:20.653859 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.653842 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 24 21:36:20.653859 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.653851 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-99gpq\"" Apr 24 21:36:20.660591 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.660567 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d"] Apr 24 21:36:20.712678 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.712642 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dwpzd\" (UniqueName: \"kubernetes.io/projected/40b0cb6d-5360-4969-b911-a4d5822d26e9-kube-api-access-dwpzd\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:20.712859 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.712697 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" 
(UniqueName: \"kubernetes.io/configmap/40b0cb6d-5360-4969-b911-a4d5822d26e9-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:20.712859 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.712779 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40b0cb6d-5360-4969-b911-a4d5822d26e9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:20.813756 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.813723 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40b0cb6d-5360-4969-b911-a4d5822d26e9-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:20.813919 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.813777 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40b0cb6d-5360-4969-b911-a4d5822d26e9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:20.813919 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.813840 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dwpzd\" (UniqueName: \"kubernetes.io/projected/40b0cb6d-5360-4969-b911-a4d5822d26e9-kube-api-access-dwpzd\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:20.813919 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:36:20.813875 2580 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 24 21:36:20.814077 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:36:20.813967 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/40b0cb6d-5360-4969-b911-a4d5822d26e9-plugin-serving-cert podName:40b0cb6d-5360-4969-b911-a4d5822d26e9 nodeName:}" failed. No retries permitted until 2026-04-24 21:36:21.313944595 +0000 UTC m=+533.112007546 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/40b0cb6d-5360-4969-b911-a4d5822d26e9-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-bvg5d" (UID: "40b0cb6d-5360-4969-b911-a4d5822d26e9") : secret "plugin-serving-cert" not found Apr 24 21:36:20.814430 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.814411 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/40b0cb6d-5360-4969-b911-a4d5822d26e9-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:20.825637 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:20.825615 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dwpzd\" (UniqueName: \"kubernetes.io/projected/40b0cb6d-5360-4969-b911-a4d5822d26e9-kube-api-access-dwpzd\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:21.319364 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:21.319324 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40b0cb6d-5360-4969-b911-a4d5822d26e9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:21.321838 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:21.321809 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/40b0cb6d-5360-4969-b911-a4d5822d26e9-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-bvg5d\" (UID: \"40b0cb6d-5360-4969-b911-a4d5822d26e9\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:21.559712 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:21.559680 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" Apr 24 21:36:21.698975 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:21.698946 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d"] Apr 24 21:36:21.701521 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:36:21.701490 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod40b0cb6d_5360_4969_b911_a4d5822d26e9.slice/crio-0f8795782bb740f7adeafe000b5fe6841e8490eded8207dadbbf373c483008f9 WatchSource:0}: Error finding container 0f8795782bb740f7adeafe000b5fe6841e8490eded8207dadbbf373c483008f9: Status 404 returned error can't find the container with id 0f8795782bb740f7adeafe000b5fe6841e8490eded8207dadbbf373c483008f9 Apr 24 21:36:22.587235 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:22.587178 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" 
event={"ID":"40b0cb6d-5360-4969-b911-a4d5822d26e9","Type":"ContainerStarted","Data":"0f8795782bb740f7adeafe000b5fe6841e8490eded8207dadbbf373c483008f9"} Apr 24 21:36:26.603027 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:26.602921 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" event={"ID":"40b0cb6d-5360-4969-b911-a4d5822d26e9","Type":"ContainerStarted","Data":"9648fc1666096a67410ec0b758394624d05a58fae032756511ec4c2e2c5eac5f"} Apr 24 21:36:26.622655 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:36:26.622607 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-bvg5d" podStartSLOduration=2.0951484479999998 podStartE2EDuration="6.622593193s" podCreationTimestamp="2026-04-24 21:36:20 +0000 UTC" firstStartedPulling="2026-04-24 21:36:21.702880254 +0000 UTC m=+533.500943202" lastFinishedPulling="2026-04-24 21:36:26.230325 +0000 UTC m=+538.028387947" observedRunningTime="2026-04-24 21:36:26.620601825 +0000 UTC m=+538.418664797" watchObservedRunningTime="2026-04-24 21:36:26.622593193 +0000 UTC m=+538.420656163" Apr 24 21:37:03.309881 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.309840 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-mrc5l"] Apr 24 21:37:03.312483 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.312464 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:03.314829 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.314807 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\"" Apr 24 21:37:03.322690 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.322666 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-mrc5l"] Apr 24 21:37:03.344593 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.344559 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dx8lz\" (UniqueName: \"kubernetes.io/projected/7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040-kube-api-access-dx8lz\") pod \"limitador-limitador-67566c68b4-mrc5l\" (UID: \"7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040\") " pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:03.344770 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.344624 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040-config-file\") pod \"limitador-limitador-67566c68b4-mrc5l\" (UID: \"7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040\") " pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:03.361387 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.361352 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-mrc5l"] Apr 24 21:37:03.445767 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.445707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dx8lz\" (UniqueName: \"kubernetes.io/projected/7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040-kube-api-access-dx8lz\") pod \"limitador-limitador-67566c68b4-mrc5l\" (UID: \"7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040\") " 
pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:03.445956 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.445793 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040-config-file\") pod \"limitador-limitador-67566c68b4-mrc5l\" (UID: \"7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040\") " pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:03.446377 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.446359 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040-config-file\") pod \"limitador-limitador-67566c68b4-mrc5l\" (UID: \"7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040\") " pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:03.455336 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.455309 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dx8lz\" (UniqueName: \"kubernetes.io/projected/7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040-kube-api-access-dx8lz\") pod \"limitador-limitador-67566c68b4-mrc5l\" (UID: \"7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040\") " pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:03.622320 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.622279 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:03.757931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.757901 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-67566c68b4-mrc5l"] Apr 24 21:37:03.760963 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:37:03.760935 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7fd39c61_a41c_4b55_8bdd_3a5ecfb5e040.slice/crio-13e702ecbe59500e0198bc2d18f90c442d284f36a26476d1aca2806e84e5f186 WatchSource:0}: Error finding container 13e702ecbe59500e0198bc2d18f90c442d284f36a26476d1aca2806e84e5f186: Status 404 returned error can't find the container with id 13e702ecbe59500e0198bc2d18f90c442d284f36a26476d1aca2806e84e5f186 Apr 24 21:37:03.798875 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.798842 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-j86ds"] Apr 24 21:37:03.801676 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.801657 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-j86ds" Apr 24 21:37:03.803976 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.803957 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-v2n49\"" Apr 24 21:37:03.809484 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.809463 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-j86ds"] Apr 24 21:37:03.848706 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.848670 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w84k9\" (UniqueName: \"kubernetes.io/projected/52ba4153-d746-4bdc-bb03-fe26a33297ff-kube-api-access-w84k9\") pod \"authorino-79cbc94b89-j86ds\" (UID: \"52ba4153-d746-4bdc-bb03-fe26a33297ff\") " pod="kuadrant-system/authorino-79cbc94b89-j86ds" Apr 24 21:37:03.949792 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.949707 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w84k9\" (UniqueName: \"kubernetes.io/projected/52ba4153-d746-4bdc-bb03-fe26a33297ff-kube-api-access-w84k9\") pod \"authorino-79cbc94b89-j86ds\" (UID: \"52ba4153-d746-4bdc-bb03-fe26a33297ff\") " pod="kuadrant-system/authorino-79cbc94b89-j86ds" Apr 24 21:37:03.958702 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:03.958677 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w84k9\" (UniqueName: \"kubernetes.io/projected/52ba4153-d746-4bdc-bb03-fe26a33297ff-kube-api-access-w84k9\") pod \"authorino-79cbc94b89-j86ds\" (UID: \"52ba4153-d746-4bdc-bb03-fe26a33297ff\") " pod="kuadrant-system/authorino-79cbc94b89-j86ds" Apr 24 21:37:04.111459 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:04.111419 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-j86ds" Apr 24 21:37:04.235463 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:04.235389 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-j86ds"] Apr 24 21:37:04.238221 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:37:04.238188 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod52ba4153_d746_4bdc_bb03_fe26a33297ff.slice/crio-8d4e82f0f963f3aef958304542f7cf6dcf46b758decc59cf14f8be6429cab665 WatchSource:0}: Error finding container 8d4e82f0f963f3aef958304542f7cf6dcf46b758decc59cf14f8be6429cab665: Status 404 returned error can't find the container with id 8d4e82f0f963f3aef958304542f7cf6dcf46b758decc59cf14f8be6429cab665 Apr 24 21:37:04.715908 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:04.715875 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" event={"ID":"7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040","Type":"ContainerStarted","Data":"13e702ecbe59500e0198bc2d18f90c442d284f36a26476d1aca2806e84e5f186"} Apr 24 21:37:04.717016 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:04.716965 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-j86ds" event={"ID":"52ba4153-d746-4bdc-bb03-fe26a33297ff","Type":"ContainerStarted","Data":"8d4e82f0f963f3aef958304542f7cf6dcf46b758decc59cf14f8be6429cab665"} Apr 24 21:37:05.721952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:05.721907 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" event={"ID":"7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040","Type":"ContainerStarted","Data":"9cdd7d1357c8703fa983f612eaf2b50493917c84a43e062ab15207b5a7664db8"} Apr 24 21:37:05.722461 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:05.722140 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="not ready" pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:05.739934 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:05.739863 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" podStartSLOduration=1.600789623 podStartE2EDuration="2.739843229s" podCreationTimestamp="2026-04-24 21:37:03 +0000 UTC" firstStartedPulling="2026-04-24 21:37:03.76263099 +0000 UTC m=+575.560693938" lastFinishedPulling="2026-04-24 21:37:04.901684581 +0000 UTC m=+576.699747544" observedRunningTime="2026-04-24 21:37:05.737943466 +0000 UTC m=+577.536006436" watchObservedRunningTime="2026-04-24 21:37:05.739843229 +0000 UTC m=+577.537906200" Apr 24 21:37:07.730603 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:07.730572 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-j86ds" event={"ID":"52ba4153-d746-4bdc-bb03-fe26a33297ff","Type":"ContainerStarted","Data":"2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867"} Apr 24 21:37:16.727700 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:16.727674 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-67566c68b4-mrc5l" Apr 24 21:37:16.747597 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:16.747551 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-j86ds" podStartSLOduration=11.315270067 podStartE2EDuration="13.747511739s" podCreationTimestamp="2026-04-24 21:37:03 +0000 UTC" firstStartedPulling="2026-04-24 21:37:04.239562923 +0000 UTC m=+576.037625870" lastFinishedPulling="2026-04-24 21:37:06.67180459 +0000 UTC m=+578.469867542" observedRunningTime="2026-04-24 21:37:07.745792414 +0000 UTC m=+579.543855385" watchObservedRunningTime="2026-04-24 21:37:16.747511739 +0000 UTC m=+588.545599660" Apr 24 21:37:27.844412 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:37:27.844380 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-j86ds"] Apr 24 21:37:27.844898 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:27.844591 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-j86ds" podUID="52ba4153-d746-4bdc-bb03-fe26a33297ff" containerName="authorino" containerID="cri-o://2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867" gracePeriod=30 Apr 24 21:37:28.082698 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.082669 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-j86ds" Apr 24 21:37:28.244557 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.244513 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w84k9\" (UniqueName: \"kubernetes.io/projected/52ba4153-d746-4bdc-bb03-fe26a33297ff-kube-api-access-w84k9\") pod \"52ba4153-d746-4bdc-bb03-fe26a33297ff\" (UID: \"52ba4153-d746-4bdc-bb03-fe26a33297ff\") " Apr 24 21:37:28.246677 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.246645 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/52ba4153-d746-4bdc-bb03-fe26a33297ff-kube-api-access-w84k9" (OuterVolumeSpecName: "kube-api-access-w84k9") pod "52ba4153-d746-4bdc-bb03-fe26a33297ff" (UID: "52ba4153-d746-4bdc-bb03-fe26a33297ff"). InnerVolumeSpecName "kube-api-access-w84k9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:37:28.345465 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.345414 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w84k9\" (UniqueName: \"kubernetes.io/projected/52ba4153-d746-4bdc-bb03-fe26a33297ff-kube-api-access-w84k9\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:37:28.633078 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.633047 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 21:37:28.633613 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.633591 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 21:37:28.637384 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.637365 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:37:28.637782 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.637763 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:37:28.801625 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.801582 2580 generic.go:358] "Generic (PLEG): container finished" podID="52ba4153-d746-4bdc-bb03-fe26a33297ff" containerID="2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867" exitCode=0 Apr 24 21:37:28.801794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.801643 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-j86ds" Apr 24 21:37:28.801794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.801668 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-j86ds" event={"ID":"52ba4153-d746-4bdc-bb03-fe26a33297ff","Type":"ContainerDied","Data":"2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867"} Apr 24 21:37:28.801794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.801703 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-j86ds" event={"ID":"52ba4153-d746-4bdc-bb03-fe26a33297ff","Type":"ContainerDied","Data":"8d4e82f0f963f3aef958304542f7cf6dcf46b758decc59cf14f8be6429cab665"} Apr 24 21:37:28.801794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.801719 2580 scope.go:117] "RemoveContainer" containerID="2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867" Apr 24 21:37:28.809401 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.809385 2580 scope.go:117] "RemoveContainer" containerID="2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867" Apr 24 21:37:28.809633 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:37:28.809617 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867\": container with ID starting with 2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867 not found: ID does not exist" containerID="2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867" Apr 24 21:37:28.809683 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.809644 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867"} err="failed to get container status \"2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867\": rpc error: code = 
NotFound desc = could not find container \"2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867\": container with ID starting with 2ccb1540dbf8a110b15ceed041ece27d77e5645e434994726a163ee1ab46d867 not found: ID does not exist" Apr 24 21:37:28.818928 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.818899 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-j86ds"] Apr 24 21:37:28.823444 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:28.823422 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-j86ds"] Apr 24 21:37:30.688241 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:37:30.688205 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="52ba4153-d746-4bdc-bb03-fe26a33297ff" path="/var/lib/kubelet/pods/52ba4153-d746-4bdc-bb03-fe26a33297ff/volumes" Apr 24 21:39:13.658929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.658888 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-6dhp2"] Apr 24 21:39:13.659418 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.659190 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="52ba4153-d746-4bdc-bb03-fe26a33297ff" containerName="authorino" Apr 24 21:39:13.659418 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.659203 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="52ba4153-d746-4bdc-bb03-fe26a33297ff" containerName="authorino" Apr 24 21:39:13.659418 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.659266 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="52ba4153-d746-4bdc-bb03-fe26a33297ff" containerName="authorino" Apr 24 21:39:13.662139 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.662123 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6dhp2" Apr 24 21:39:13.664551 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.664532 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 24 21:39:13.665235 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.665217 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 24 21:39:13.665318 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.665219 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 24 21:39:13.665318 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.665223 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-bxpd9\"" Apr 24 21:39:13.668738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.668710 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6dhp2"] Apr 24 21:39:13.695981 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.695945 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gjst9\" (UniqueName: \"kubernetes.io/projected/989e4967-b2c7-40ab-bf6b-df37a1303bf4-kube-api-access-gjst9\") pod \"s3-init-6dhp2\" (UID: \"989e4967-b2c7-40ab-bf6b-df37a1303bf4\") " pod="kserve/s3-init-6dhp2" Apr 24 21:39:13.796924 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.796883 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gjst9\" (UniqueName: \"kubernetes.io/projected/989e4967-b2c7-40ab-bf6b-df37a1303bf4-kube-api-access-gjst9\") pod \"s3-init-6dhp2\" (UID: \"989e4967-b2c7-40ab-bf6b-df37a1303bf4\") " pod="kserve/s3-init-6dhp2" Apr 24 21:39:13.805929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.805895 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gjst9\" (UniqueName: 
\"kubernetes.io/projected/989e4967-b2c7-40ab-bf6b-df37a1303bf4-kube-api-access-gjst9\") pod \"s3-init-6dhp2\" (UID: \"989e4967-b2c7-40ab-bf6b-df37a1303bf4\") " pod="kserve/s3-init-6dhp2" Apr 24 21:39:13.972115 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:13.972032 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6dhp2" Apr 24 21:39:14.095963 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:14.095928 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-6dhp2"] Apr 24 21:39:14.099061 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:39:14.099034 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod989e4967_b2c7_40ab_bf6b_df37a1303bf4.slice/crio-c95bcab2e1c893a696cdcffe01d84837b34ea93341878de613d6b6e6b8954db9 WatchSource:0}: Error finding container c95bcab2e1c893a696cdcffe01d84837b34ea93341878de613d6b6e6b8954db9: Status 404 returned error can't find the container with id c95bcab2e1c893a696cdcffe01d84837b34ea93341878de613d6b6e6b8954db9 Apr 24 21:39:14.116931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:14.116904 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6dhp2" event={"ID":"989e4967-b2c7-40ab-bf6b-df37a1303bf4","Type":"ContainerStarted","Data":"c95bcab2e1c893a696cdcffe01d84837b34ea93341878de613d6b6e6b8954db9"} Apr 24 21:39:19.134708 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:19.134665 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6dhp2" event={"ID":"989e4967-b2c7-40ab-bf6b-df37a1303bf4","Type":"ContainerStarted","Data":"cb0422dca0e8adb4903994b425694a3500b3dee51812992542c5dfeeb74640de"} Apr 24 21:39:19.156332 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:19.156268 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/s3-init-6dhp2" podStartSLOduration=1.7513133600000002 podStartE2EDuration="6.15625166s" 
podCreationTimestamp="2026-04-24 21:39:13 +0000 UTC" firstStartedPulling="2026-04-24 21:39:14.100847797 +0000 UTC m=+705.898910760" lastFinishedPulling="2026-04-24 21:39:18.505786109 +0000 UTC m=+710.303849060" observedRunningTime="2026-04-24 21:39:19.153668976 +0000 UTC m=+710.951731947" watchObservedRunningTime="2026-04-24 21:39:19.15625166 +0000 UTC m=+710.954314631" Apr 24 21:39:22.145283 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:22.145250 2580 generic.go:358] "Generic (PLEG): container finished" podID="989e4967-b2c7-40ab-bf6b-df37a1303bf4" containerID="cb0422dca0e8adb4903994b425694a3500b3dee51812992542c5dfeeb74640de" exitCode=0 Apr 24 21:39:22.145659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:22.145312 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6dhp2" event={"ID":"989e4967-b2c7-40ab-bf6b-df37a1303bf4","Type":"ContainerDied","Data":"cb0422dca0e8adb4903994b425694a3500b3dee51812992542c5dfeeb74640de"} Apr 24 21:39:23.278733 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:23.278709 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-6dhp2" Apr 24 21:39:23.378323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:23.378287 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gjst9\" (UniqueName: \"kubernetes.io/projected/989e4967-b2c7-40ab-bf6b-df37a1303bf4-kube-api-access-gjst9\") pod \"989e4967-b2c7-40ab-bf6b-df37a1303bf4\" (UID: \"989e4967-b2c7-40ab-bf6b-df37a1303bf4\") " Apr 24 21:39:23.380515 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:23.380477 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/989e4967-b2c7-40ab-bf6b-df37a1303bf4-kube-api-access-gjst9" (OuterVolumeSpecName: "kube-api-access-gjst9") pod "989e4967-b2c7-40ab-bf6b-df37a1303bf4" (UID: "989e4967-b2c7-40ab-bf6b-df37a1303bf4"). InnerVolumeSpecName "kube-api-access-gjst9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:39:23.479849 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:23.479760 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gjst9\" (UniqueName: \"kubernetes.io/projected/989e4967-b2c7-40ab-bf6b-df37a1303bf4-kube-api-access-gjst9\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:39:24.151181 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:24.151143 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-6dhp2" event={"ID":"989e4967-b2c7-40ab-bf6b-df37a1303bf4","Type":"ContainerDied","Data":"c95bcab2e1c893a696cdcffe01d84837b34ea93341878de613d6b6e6b8954db9"} Apr 24 21:39:24.151181 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:24.151178 2580 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="c95bcab2e1c893a696cdcffe01d84837b34ea93341878de613d6b6e6b8954db9" Apr 24 21:39:24.151181 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:39:24.151175 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-6dhp2" Apr 24 21:40:00.605565 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.605533 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"] Apr 24 21:40:00.606038 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.605822 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="989e4967-b2c7-40ab-bf6b-df37a1303bf4" containerName="s3-init" Apr 24 21:40:00.606038 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.605833 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="989e4967-b2c7-40ab-bf6b-df37a1303bf4" containerName="s3-init" Apr 24 21:40:00.606038 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.605895 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="989e4967-b2c7-40ab-bf6b-df37a1303bf4" containerName="s3-init" Apr 24 21:40:00.644888 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.644847 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"] Apr 24 21:40:00.645074 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.645025 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.647572 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.647547 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 24 21:40:00.648309 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.648291 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\""
Apr 24 21:40:00.648437 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.648314 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 24 21:40:00.648437 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.648317 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 24 21:40:00.691909 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.691878 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tmp-dir\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.692138 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.691919 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-model-cache\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.692138 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.691940 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.692138 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.691967 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-26gpj\" (UniqueName: \"kubernetes.io/projected/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kube-api-access-26gpj\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.692138 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.692105 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-home\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.692313 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.692140 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-dshm\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.692313 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.692208 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793217 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793159 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-dshm\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793217 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793224 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793521 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793257 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tmp-dir\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793521 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793285 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-model-cache\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793521 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793303 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793521 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793330 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-26gpj\" (UniqueName: \"kubernetes.io/projected/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kube-api-access-26gpj\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793521 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793402 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-home\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793773 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793750 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tmp-dir\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793825 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793785 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793825 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793801 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-model-cache\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.793888 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.793840 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-home\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.795522 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.795504 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-dshm\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.795780 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.795763 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.801505 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.801482 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-26gpj\" (UniqueName: \"kubernetes.io/projected/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kube-api-access-26gpj\") pod \"scheduler-inline-config-test-kserve-d6b549768-lcd6s\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:00.955633 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:00.955600 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:40:01.128261 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:01.128232 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"]
Apr 24 21:40:01.130874 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:40:01.130842 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd9cb97e8_4392_44ce_9a0e_1e28b1694ea0.slice/crio-70292fdde715bb53b606d0eaa9d46c44e98bec1b8cef70b01f5c9dd0da65bf72 WatchSource:0}: Error finding container 70292fdde715bb53b606d0eaa9d46c44e98bec1b8cef70b01f5c9dd0da65bf72: Status 404 returned error can't find the container with id 70292fdde715bb53b606d0eaa9d46c44e98bec1b8cef70b01f5c9dd0da65bf72
Apr 24 21:40:01.133184 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:01.133166 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:40:01.255808 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:01.255719 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" event={"ID":"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0","Type":"ContainerStarted","Data":"70292fdde715bb53b606d0eaa9d46c44e98bec1b8cef70b01f5c9dd0da65bf72"}
Apr 24 21:40:05.270447 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:05.270403 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" event={"ID":"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0","Type":"ContainerStarted","Data":"1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9"}
Apr 24 21:40:31.101319 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.101274 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"]
Apr 24 21:40:31.118118 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.118088 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"]
Apr 24 21:40:31.118272 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.118213 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.120659 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.120632 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 24 21:40:31.249274 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.249233 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vb8vj\" (UniqueName: \"kubernetes.io/projected/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kube-api-access-vb8vj\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.249481 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.249287 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.249481 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.249410 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.249481 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.249444 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.249622 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.249483 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.249622 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.249516 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.249622 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.249544 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.350733 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.350697 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vb8vj\" (UniqueName: \"kubernetes.io/projected/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kube-api-access-vb8vj\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.350921 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.350750 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.350921 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.350811 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.350921 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.350834 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.351112 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.350953 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.351112 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.351024 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.351112 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.351069 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.351425 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.351366 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.351425 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.351400 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.351570 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.351432 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.351570 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.351472 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.353185 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.353164 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.353450 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.353434 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.359532 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.359510 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vb8vj\" (UniqueName: \"kubernetes.io/projected/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kube-api-access-vb8vj\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.429748 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.429714 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"
Apr 24 21:40:31.562923 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:31.562891 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"]
Apr 24 21:40:31.566443 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:40:31.566406 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde2d78af_ec84_496e_8a7e_811a69e5a7c1.slice/crio-49de0a80fe2f5d2619204a1a55c0d88a971bb027cf09850469f46e918a46d3b7 WatchSource:0}: Error finding container 49de0a80fe2f5d2619204a1a55c0d88a971bb027cf09850469f46e918a46d3b7: Status 404 returned error can't find the container with id 49de0a80fe2f5d2619204a1a55c0d88a971bb027cf09850469f46e918a46d3b7
Apr 24 21:40:32.350925 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:32.350893 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" event={"ID":"de2d78af-ec84-496e-8a7e-811a69e5a7c1","Type":"ContainerStarted","Data":"d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7"}
Apr 24 21:40:32.350925 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:40:32.350927 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" event={"ID":"de2d78af-ec84-496e-8a7e-811a69e5a7c1","Type":"ContainerStarted","Data":"49de0a80fe2f5d2619204a1a55c0d88a971bb027cf09850469f46e918a46d3b7"}
Apr 24 21:42:25.688064 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:25.688026 2580 generic.go:358] "Generic (PLEG): container finished" podID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" containerID="1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9" exitCode=0
Apr 24 21:42:25.688485 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:25.688097 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" event={"ID":"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0","Type":"ContainerDied","Data":"1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9"}
Apr 24 21:42:27.695400 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:27.695367 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" event={"ID":"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0","Type":"ContainerStarted","Data":"d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539"}
Apr 24 21:42:27.715950 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:27.715895 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" podStartSLOduration=2.192313263 podStartE2EDuration="2m27.715878035s" podCreationTimestamp="2026-04-24 21:40:00 +0000 UTC" firstStartedPulling="2026-04-24 21:40:01.133322075 +0000 UTC m=+752.931385023" lastFinishedPulling="2026-04-24 21:42:26.656886845 +0000 UTC m=+898.454949795" observedRunningTime="2026-04-24 21:42:27.714463402 +0000 UTC m=+899.512526372" watchObservedRunningTime="2026-04-24 21:42:27.715878035 +0000 UTC m=+899.513941005"
Apr 24 21:42:28.655511 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:28.655476 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 21:42:28.655700 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:28.655686 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 21:42:28.659357 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:28.659334 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 21:42:28.659483 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:28.659421 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 21:42:30.956082 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:30.956039 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:42:30.956554 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:30.956123 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:42:30.968693 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:30.968661 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:42:31.718416 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:31.718384 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:42:32.813776 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:32.813740 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"]
Apr 24 21:42:32.826648 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:42:32.826617 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-inline-config-test-kserve-self-signed-certs: secret "scheduler-inline-config-test-kserve-self-signed-certs" not found
Apr 24 21:42:32.826805 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:42:32.826701 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs podName:d9cb97e8-4392-44ce-9a0e-1e28b1694ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:42:33.326684857 +0000 UTC m=+905.124747804 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs") pod "scheduler-inline-config-test-kserve-d6b549768-lcd6s" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0") : secret "scheduler-inline-config-test-kserve-self-signed-certs" not found
Apr 24 21:42:33.330306 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:42:33.330271 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-inline-config-test-kserve-self-signed-certs: secret "scheduler-inline-config-test-kserve-self-signed-certs" not found
Apr 24 21:42:33.330492 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:42:33.330346 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs podName:d9cb97e8-4392-44ce-9a0e-1e28b1694ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:42:34.330331591 +0000 UTC m=+906.128394539 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs") pod "scheduler-inline-config-test-kserve-d6b549768-lcd6s" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0") : secret "scheduler-inline-config-test-kserve-self-signed-certs" not found
Apr 24 21:42:34.340236 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:42:34.340197 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-inline-config-test-kserve-self-signed-certs: secret "scheduler-inline-config-test-kserve-self-signed-certs" not found
Apr 24 21:42:34.340612 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:42:34.340267 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs podName:d9cb97e8-4392-44ce-9a0e-1e28b1694ea0 nodeName:}" failed. No retries permitted until 2026-04-24 21:42:36.340252339 +0000 UTC m=+908.138315288 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs") pod "scheduler-inline-config-test-kserve-d6b549768-lcd6s" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0") : secret "scheduler-inline-config-test-kserve-self-signed-certs" not found
Apr 24 21:42:34.717327 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:34.717262 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" podUID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" containerName="main" containerID="cri-o://d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539" gracePeriod=30
Apr 24 21:42:34.989449 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:34.989416 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"
Apr 24 21:42:35.045737 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.045689 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tmp-dir\") pod \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") "
Apr 24 21:42:35.045940 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.045762 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-dshm\") pod \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") "
Apr 24 21:42:35.045940 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.045790 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-model-cache\") pod \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") "
Apr 24 21:42:35.045940 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.045812 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs\") pod \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") "
Apr 24 21:42:35.045940 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.045843 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-home\") pod \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") "
Apr 24 21:42:35.045940 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.045877 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-26gpj\" (UniqueName: \"kubernetes.io/projected/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kube-api-access-26gpj\") pod \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") "
Apr 24 21:42:35.045940 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.045906 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kserve-provision-location\") pod \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\" (UID: \"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0\") "
Apr 24 21:42:35.046318 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.046025 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:42:35.046318 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.046174 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-model-cache" (OuterVolumeSpecName: "model-cache") pod "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:42:35.046318 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.046226 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:42:35.046318 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.046287 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-home" (OuterVolumeSpecName: "home") pod "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:42:35.048400 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.048363 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kube-api-access-26gpj" (OuterVolumeSpecName: "kube-api-access-26gpj") pod "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0"). InnerVolumeSpecName "kube-api-access-26gpj". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:42:35.048531 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.048495 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-dshm" (OuterVolumeSpecName: "dshm") pod "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:42:35.048531 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.048495 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:42:35.146873 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.146824 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-26gpj\" (UniqueName: \"kubernetes.io/projected/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kube-api-access-26gpj\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:42:35.146873 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.146862 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:42:35.146873 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.146871 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:42:35.146873 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.146882 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:42:35.146873 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.146890 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:42:35.721803 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.721768 2580 generic.go:358] "Generic (PLEG): container finished" podID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" containerID="d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539" exitCode=0 Apr 24 21:42:35.722205 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.721846 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" Apr 24 21:42:35.722205 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.721853 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" event={"ID":"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0","Type":"ContainerDied","Data":"d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539"} Apr 24 21:42:35.722205 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.721889 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s" event={"ID":"d9cb97e8-4392-44ce-9a0e-1e28b1694ea0","Type":"ContainerDied","Data":"70292fdde715bb53b606d0eaa9d46c44e98bec1b8cef70b01f5c9dd0da65bf72"} Apr 24 21:42:35.722205 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.721905 2580 scope.go:117] "RemoveContainer" containerID="d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539" Apr 24 21:42:35.730739 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.730719 2580 scope.go:117] "RemoveContainer" containerID="1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9" Apr 24 21:42:35.792219 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.792193 2580 scope.go:117] "RemoveContainer" containerID="d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539" Apr 24 21:42:35.792572 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:42:35.792539 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = 
NotFound desc = could not find container \"d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539\": container with ID starting with d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539 not found: ID does not exist" containerID="d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539" Apr 24 21:42:35.792683 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.792573 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539"} err="failed to get container status \"d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539\": rpc error: code = NotFound desc = could not find container \"d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539\": container with ID starting with d143ad46187d18db1af58a35963e8d01036fbf7558a0b796b8e9008caecff539 not found: ID does not exist" Apr 24 21:42:35.792683 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.792592 2580 scope.go:117] "RemoveContainer" containerID="1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9" Apr 24 21:42:35.792891 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:42:35.792873 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9\": container with ID starting with 1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9 not found: ID does not exist" containerID="1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9" Apr 24 21:42:35.792930 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:35.792899 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9"} err="failed to get container status \"1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9\": rpc error: code = NotFound desc = 
could not find container \"1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9\": container with ID starting with 1366b86f3ba6e54440822f7b52eef7d4c930b9ea4893dbb7adcd156ae4eb52d9 not found: ID does not exist" Apr 24 21:42:36.546889 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:36.546829 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" (UID: "d9cb97e8-4392-44ce-9a0e-1e28b1694ea0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:42:36.556926 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:36.556895 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:42:36.644136 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:36.644099 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"] Apr 24 21:42:36.649271 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:36.649240 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-d6b549768-lcd6s"] Apr 24 21:42:36.689142 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:42:36.689100 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" path="/var/lib/kubelet/pods/d9cb97e8-4392-44ce-9a0e-1e28b1694ea0/volumes" Apr 24 21:43:24.875690 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:24.875655 2580 generic.go:358] "Generic (PLEG): container finished" podID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" 
containerID="d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7" exitCode=0 Apr 24 21:43:24.876136 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:24.875733 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" event={"ID":"de2d78af-ec84-496e-8a7e-811a69e5a7c1","Type":"ContainerDied","Data":"d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7"} Apr 24 21:43:25.880311 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:25.880276 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" event={"ID":"de2d78af-ec84-496e-8a7e-811a69e5a7c1","Type":"ContainerStarted","Data":"cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab"} Apr 24 21:43:25.901717 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:25.901648 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" podStartSLOduration=174.901632106 podStartE2EDuration="2m54.901632106s" podCreationTimestamp="2026-04-24 21:40:31 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:43:25.900739611 +0000 UTC m=+957.698802582" watchObservedRunningTime="2026-04-24 21:43:25.901632106 +0000 UTC m=+957.699695141" Apr 24 21:43:31.430274 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:31.430226 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" Apr 24 21:43:31.430274 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:31.430282 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" Apr 24 21:43:31.443456 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:31.443422 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" Apr 24 21:43:31.909885 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:31.909856 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" Apr 24 21:43:33.023028 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:43:33.022969 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs: secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 24 21:43:33.023487 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:43:33.023092 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs podName:de2d78af-ec84-496e-8a7e-811a69e5a7c1 nodeName:}" failed. No retries permitted until 2026-04-24 21:43:33.523067876 +0000 UTC m=+965.321130823 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs") pod "llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" (UID: "de2d78af-ec84-496e-8a7e-811a69e5a7c1") : secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 24 21:43:33.084985 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:33.084940 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"] Apr 24 21:43:33.527486 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:43:33.527448 2580 secret.go:189] Couldn't get secret kserve-ci-e2e-test/llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs: secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 24 21:43:33.527668 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:43:33.527528 2580 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs podName:de2d78af-ec84-496e-8a7e-811a69e5a7c1 nodeName:}" failed. No retries permitted until 2026-04-24 21:43:34.527512006 +0000 UTC m=+966.325574955 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs") pod "llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" (UID: "de2d78af-ec84-496e-8a7e-811a69e5a7c1") : secret "llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs" not found Apr 24 21:43:33.904439 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:33.904394 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" podUID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" containerName="main" containerID="cri-o://cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab" gracePeriod=30 Apr 24 21:43:34.151532 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.151505 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" Apr 24 21:43:34.234607 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.234514 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-dshm\") pod \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " Apr 24 21:43:34.234761 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.234650 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs\") pod \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " Apr 24 21:43:34.234761 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.234745 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vb8vj\" (UniqueName: 
\"kubernetes.io/projected/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kube-api-access-vb8vj\") pod \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " Apr 24 21:43:34.234838 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.234786 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-model-cache\") pod \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " Apr 24 21:43:34.234838 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.234821 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-home\") pod \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " Apr 24 21:43:34.234912 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.234869 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kserve-provision-location\") pod \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " Apr 24 21:43:34.234912 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.234890 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tmp-dir\") pod \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\" (UID: \"de2d78af-ec84-496e-8a7e-811a69e5a7c1\") " Apr 24 21:43:34.235101 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.235068 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-model-cache" (OuterVolumeSpecName: "model-cache") pod "de2d78af-ec84-496e-8a7e-811a69e5a7c1" (UID: 
"de2d78af-ec84-496e-8a7e-811a69e5a7c1"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:34.235197 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.235166 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-home" (OuterVolumeSpecName: "home") pod "de2d78af-ec84-496e-8a7e-811a69e5a7c1" (UID: "de2d78af-ec84-496e-8a7e-811a69e5a7c1"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:34.235355 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.235334 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "de2d78af-ec84-496e-8a7e-811a69e5a7c1" (UID: "de2d78af-ec84-496e-8a7e-811a69e5a7c1"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:34.236986 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.236961 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kube-api-access-vb8vj" (OuterVolumeSpecName: "kube-api-access-vb8vj") pod "de2d78af-ec84-496e-8a7e-811a69e5a7c1" (UID: "de2d78af-ec84-496e-8a7e-811a69e5a7c1"). InnerVolumeSpecName "kube-api-access-vb8vj". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:43:34.236986 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.236970 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "de2d78af-ec84-496e-8a7e-811a69e5a7c1" (UID: "de2d78af-ec84-496e-8a7e-811a69e5a7c1"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:43:34.237241 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.237220 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-dshm" (OuterVolumeSpecName: "dshm") pod "de2d78af-ec84-496e-8a7e-811a69e5a7c1" (UID: "de2d78af-ec84-496e-8a7e-811a69e5a7c1"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:34.293987 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.293936 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "de2d78af-ec84-496e-8a7e-811a69e5a7c1" (UID: "de2d78af-ec84-496e-8a7e-811a69e5a7c1"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:43:34.336158 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.336122 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:43:34.336158 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.336156 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vb8vj\" (UniqueName: \"kubernetes.io/projected/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kube-api-access-vb8vj\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:43:34.336315 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.336170 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:43:34.336315 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.336180 2580 
reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:43:34.336315 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.336189 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:43:34.336315 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.336198 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:43:34.336315 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.336207 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de2d78af-ec84-496e-8a7e-811a69e5a7c1-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:43:34.908704 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.908662 2580 generic.go:358] "Generic (PLEG): container finished" podID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" containerID="cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab" exitCode=0 Apr 24 21:43:34.908876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.908742 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" event={"ID":"de2d78af-ec84-496e-8a7e-811a69e5a7c1","Type":"ContainerDied","Data":"cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab"} Apr 24 21:43:34.908876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.908766 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" Apr 24 21:43:34.908876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.908783 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx" event={"ID":"de2d78af-ec84-496e-8a7e-811a69e5a7c1","Type":"ContainerDied","Data":"49de0a80fe2f5d2619204a1a55c0d88a971bb027cf09850469f46e918a46d3b7"} Apr 24 21:43:34.908876 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.908800 2580 scope.go:117] "RemoveContainer" containerID="cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab" Apr 24 21:43:34.916829 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.916805 2580 scope.go:117] "RemoveContainer" containerID="d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7" Apr 24 21:43:34.929010 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.928953 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"] Apr 24 21:43:34.935833 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.935803 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-5ddc4dcf76f29tx"] Apr 24 21:43:34.979884 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.979856 2580 scope.go:117] "RemoveContainer" containerID="cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab" Apr 24 21:43:34.980262 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:43:34.980240 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab\": container with ID starting with cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab not found: ID does not exist" 
containerID="cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab" Apr 24 21:43:34.980358 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.980272 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab"} err="failed to get container status \"cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab\": rpc error: code = NotFound desc = could not find container \"cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab\": container with ID starting with cadd83391ad8d238dd8fed2f6950c1265671305573c94fbacd747253a65938ab not found: ID does not exist" Apr 24 21:43:34.980358 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.980292 2580 scope.go:117] "RemoveContainer" containerID="d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7" Apr 24 21:43:34.980569 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:43:34.980552 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7\": container with ID starting with d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7 not found: ID does not exist" containerID="d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7" Apr 24 21:43:34.980618 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:34.980575 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7"} err="failed to get container status \"d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7\": rpc error: code = NotFound desc = could not find container \"d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7\": container with ID starting with d392115d6bcc4210e173e474d868e3e988b4539f9f19274537adf9a2afc650c7 not found: ID does not exist" Apr 24 
21:43:36.688186 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:36.688153 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" path="/var/lib/kubelet/pods/de2d78af-ec84-496e-8a7e-811a69e5a7c1/volumes" Apr 24 21:43:52.708428 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708343 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j"] Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708656 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" containerName="storage-initializer" Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708667 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" containerName="storage-initializer" Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708676 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" containerName="main" Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708682 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" containerName="main" Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708689 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" containerName="storage-initializer" Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708695 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" containerName="storage-initializer" Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708708 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" containerName="main" Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708717 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" containerName="main" Apr 24 21:43:52.708772 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708776 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d9cb97e8-4392-44ce-9a0e-1e28b1694ea0" containerName="main" Apr 24 21:43:52.709074 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.708787 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="de2d78af-ec84-496e-8a7e-811a69e5a7c1" containerName="main" Apr 24 21:43:52.711534 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.711511 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.714188 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.714153 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 24 21:43:52.714319 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.714262 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 24 21:43:52.714319 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.714261 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-bbrxr\"" Apr 24 21:43:52.714435 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.714259 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 24 21:43:52.721377 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.721353 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j"] Apr 24 21:43:52.785478 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.785433 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.785478 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.785481 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.785742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.785512 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.785742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.785540 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83bed8e4-5185-4c49-943a-0b47d64de7c9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: 
\"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.785742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.785620 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.785742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.785673 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.785742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.785706 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5jdng\" (UniqueName: \"kubernetes.io/projected/83bed8e4-5185-4c49-943a-0b47d64de7c9-kube-api-access-5jdng\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886271 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886226 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886271 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886275 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886525 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886409 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5jdng\" (UniqueName: \"kubernetes.io/projected/83bed8e4-5185-4c49-943a-0b47d64de7c9-kube-api-access-5jdng\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886525 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886495 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886641 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886528 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886641 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886560 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886641 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886594 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83bed8e4-5185-4c49-943a-0b47d64de7c9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886641 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886627 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.886818 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886780 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 
21:43:52.886878 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886861 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.887024 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.886970 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.888607 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.888587 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.889019 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.888983 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83bed8e4-5185-4c49-943a-0b47d64de7c9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:52.895412 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:52.895378 2580 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5jdng\" (UniqueName: \"kubernetes.io/projected/83bed8e4-5185-4c49-943a-0b47d64de7c9-kube-api-access-5jdng\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:53.022883 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:53.022782 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:43:53.164502 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:53.164460 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j"] Apr 24 21:43:53.168416 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:43:53.168372 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod83bed8e4_5185_4c49_943a_0b47d64de7c9.slice/crio-f37749baf09199cf6183a34cc87246e68f438dbf2b96cbedf39193cc77d24f5b WatchSource:0}: Error finding container f37749baf09199cf6183a34cc87246e68f438dbf2b96cbedf39193cc77d24f5b: Status 404 returned error can't find the container with id f37749baf09199cf6183a34cc87246e68f438dbf2b96cbedf39193cc77d24f5b Apr 24 21:43:53.976900 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:53.976866 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" event={"ID":"83bed8e4-5185-4c49-943a-0b47d64de7c9","Type":"ContainerStarted","Data":"ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7"} Apr 24 21:43:53.976900 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:53.976906 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" event={"ID":"83bed8e4-5185-4c49-943a-0b47d64de7c9","Type":"ContainerStarted","Data":"f37749baf09199cf6183a34cc87246e68f438dbf2b96cbedf39193cc77d24f5b"} Apr 24 21:43:57.990505 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:57.990469 2580 generic.go:358] "Generic (PLEG): container finished" podID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerID="ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7" exitCode=0 Apr 24 21:43:57.990888 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:43:57.990513 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" event={"ID":"83bed8e4-5185-4c49-943a-0b47d64de7c9","Type":"ContainerDied","Data":"ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7"} Apr 24 21:44:26.090839 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:26.090805 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" event={"ID":"83bed8e4-5185-4c49-943a-0b47d64de7c9","Type":"ContainerStarted","Data":"966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db"} Apr 24 21:44:26.113260 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:26.113207 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podStartSLOduration=6.833648436 podStartE2EDuration="34.113190192s" podCreationTimestamp="2026-04-24 21:43:52 +0000 UTC" firstStartedPulling="2026-04-24 21:43:57.991649815 +0000 UTC m=+989.789712763" lastFinishedPulling="2026-04-24 21:44:25.271191565 +0000 UTC m=+1017.069254519" observedRunningTime="2026-04-24 21:44:26.11261524 +0000 UTC m=+1017.910678210" watchObservedRunningTime="2026-04-24 21:44:26.113190192 +0000 UTC m=+1017.911253175" Apr 24 21:44:33.023245 ip-10-0-133-36 
kubenswrapper[2580]: I0424 21:44:33.023211 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:44:33.023634 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:33.023259 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:44:33.024757 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:33.024731 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:44:38.506084 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.506048 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt"] Apr 24 21:44:38.819323 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.819229 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt"] Apr 24 21:44:38.819480 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.819379 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:38.822503 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.822415 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 24 21:44:38.822503 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.822432 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-w6fwt\"" Apr 24 21:44:38.916606 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.916572 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:38.916776 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.916701 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87054e03-6255-407b-85c5-11b53b0c5f50-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:38.916776 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.916732 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: 
\"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:38.916776 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.916761 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:38.916925 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.916819 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:38.916925 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:38.916843 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hsxhp\" (UniqueName: \"kubernetes.io/projected/87054e03-6255-407b-85c5-11b53b0c5f50-kube-api-access-hsxhp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.017530 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.017484 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-kserve-provision-location\") pod 
\"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.017716 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.017545 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87054e03-6255-407b-85c5-11b53b0c5f50-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.017716 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.017571 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.017716 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.017589 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.017716 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.017630 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" 
(UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.017716 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.017654 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hsxhp\" (UniqueName: \"kubernetes.io/projected/87054e03-6255-407b-85c5-11b53b0c5f50-kube-api-access-hsxhp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.018046 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.018017 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.018138 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.018048 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.018138 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.018118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") 
" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.018243 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.018154 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.020301 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.020277 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87054e03-6255-407b-85c5-11b53b0c5f50-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.028750 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.028724 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hsxhp\" (UniqueName: \"kubernetes.io/projected/87054e03-6255-407b-85c5-11b53b0c5f50-kube-api-access-hsxhp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.132778 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.132748 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:44:39.272407 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:39.272377 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt"] Apr 24 21:44:39.275445 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:44:39.275413 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod87054e03_6255_407b_85c5_11b53b0c5f50.slice/crio-3a5ab143ae40897ce1c13c3b917cb685cb15aa647057d61a8757cba028d77f8b WatchSource:0}: Error finding container 3a5ab143ae40897ce1c13c3b917cb685cb15aa647057d61a8757cba028d77f8b: Status 404 returned error can't find the container with id 3a5ab143ae40897ce1c13c3b917cb685cb15aa647057d61a8757cba028d77f8b Apr 24 21:44:40.138487 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:40.138453 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" event={"ID":"87054e03-6255-407b-85c5-11b53b0c5f50","Type":"ContainerStarted","Data":"fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818"} Apr 24 21:44:40.138487 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:40.138490 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" event={"ID":"87054e03-6255-407b-85c5-11b53b0c5f50","Type":"ContainerStarted","Data":"3a5ab143ae40897ce1c13c3b917cb685cb15aa647057d61a8757cba028d77f8b"} Apr 24 21:44:41.143158 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:41.143117 2580 generic.go:358] "Generic (PLEG): container finished" podID="87054e03-6255-407b-85c5-11b53b0c5f50" containerID="fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818" exitCode=0 Apr 24 21:44:41.143536 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:44:41.143155 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" event={"ID":"87054e03-6255-407b-85c5-11b53b0c5f50","Type":"ContainerDied","Data":"fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818"} Apr 24 21:44:43.023540 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:43.023491 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:44:43.150932 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:43.150890 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" event={"ID":"87054e03-6255-407b-85c5-11b53b0c5f50","Type":"ContainerStarted","Data":"5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8"} Apr 24 21:44:53.023595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:44:53.023539 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:45:03.023568 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:03.023517 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:45:11.921590 ip-10-0-133-36 
kubenswrapper[2580]: I0424 21:45:11.921540 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt"] Apr 24 21:45:13.023213 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:13.023171 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:45:15.265675 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:15.265637 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" event={"ID":"87054e03-6255-407b-85c5-11b53b0c5f50","Type":"ContainerStarted","Data":"8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972"} Apr 24 21:45:15.266210 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:15.265842 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="main" containerID="cri-o://5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8" gracePeriod=30 Apr 24 21:45:15.266210 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:15.265886 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="tokenizer" containerID="cri-o://8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972" gracePeriod=30 Apr 24 21:45:15.266210 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:15.265921 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:45:15.269155 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:15.269119 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")" Apr 24 21:45:15.290145 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:15.290087 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" podStartSLOduration=3.583753071 podStartE2EDuration="37.290068036s" podCreationTimestamp="2026-04-24 21:44:38 +0000 UTC" firstStartedPulling="2026-04-24 21:44:41.14425556 +0000 UTC m=+1032.942318508" lastFinishedPulling="2026-04-24 21:45:14.850570525 +0000 UTC m=+1066.648633473" observedRunningTime="2026-04-24 21:45:15.288942544 +0000 UTC m=+1067.087005518" watchObservedRunningTime="2026-04-24 21:45:15.290068036 +0000 UTC m=+1067.088131005" Apr 24 21:45:16.270073 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:16.270031 2580 generic.go:358] "Generic (PLEG): container finished" podID="87054e03-6255-407b-85c5-11b53b0c5f50" containerID="5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8" exitCode=0 Apr 24 21:45:16.270459 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:16.270106 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" event={"ID":"87054e03-6255-407b-85c5-11b53b0c5f50","Type":"ContainerDied","Data":"5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8"} Apr 24 21:45:19.133591 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:19.133555 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:45:23.023623 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:23.023582 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:45:25.266709 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:45:25.266679 2580 logging.go:55] [core] [Channel #20 SubChannel #21]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused" Apr 24 21:45:26.266626 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:26.266574 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded" Apr 24 21:45:33.023554 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:33.023504 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:45:35.267170 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:45:35.267137 2580 logging.go:55] [core] [Channel #22 SubChannel #23]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused" Apr 24 21:45:36.266738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:36.266697 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded" Apr 24 21:45:43.024074 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:43.024028 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:45:45.267010 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:45:45.266976 2580 logging.go:55] [core] [Channel #24 SubChannel #25]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.30:9003", ServerName: "10.132.0.30:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.30:9003: connect: connection refused" Apr 24 21:45:45.927783 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:45.927752 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt_87054e03-6255-407b-85c5-11b53b0c5f50/tokenizer/0.log" Apr 24 21:45:45.928524 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:45.928501 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:45:46.025520 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.025482 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87054e03-6255-407b-85c5-11b53b0c5f50-tls-certs\") pod \"87054e03-6255-407b-85c5-11b53b0c5f50\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " Apr 24 21:45:46.025739 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.025566 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hsxhp\" (UniqueName: \"kubernetes.io/projected/87054e03-6255-407b-85c5-11b53b0c5f50-kube-api-access-hsxhp\") pod \"87054e03-6255-407b-85c5-11b53b0c5f50\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " Apr 24 21:45:46.025739 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.025646 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-tmp\") pod \"87054e03-6255-407b-85c5-11b53b0c5f50\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " Apr 24 21:45:46.025739 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.025671 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-kserve-provision-location\") pod \"87054e03-6255-407b-85c5-11b53b0c5f50\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " Apr 24 21:45:46.025739 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.025736 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-uds\") pod \"87054e03-6255-407b-85c5-11b53b0c5f50\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " Apr 24 
21:45:46.025950 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.025774 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-cache\") pod \"87054e03-6255-407b-85c5-11b53b0c5f50\" (UID: \"87054e03-6255-407b-85c5-11b53b0c5f50\") " Apr 24 21:45:46.026094 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.026069 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "87054e03-6255-407b-85c5-11b53b0c5f50" (UID: "87054e03-6255-407b-85c5-11b53b0c5f50"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:46.026171 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.026112 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "87054e03-6255-407b-85c5-11b53b0c5f50" (UID: "87054e03-6255-407b-85c5-11b53b0c5f50"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:46.026217 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.026178 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "87054e03-6255-407b-85c5-11b53b0c5f50" (UID: "87054e03-6255-407b-85c5-11b53b0c5f50"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:46.026653 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.026622 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "87054e03-6255-407b-85c5-11b53b0c5f50" (UID: "87054e03-6255-407b-85c5-11b53b0c5f50"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:45:46.028042 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.028018 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/87054e03-6255-407b-85c5-11b53b0c5f50-kube-api-access-hsxhp" (OuterVolumeSpecName: "kube-api-access-hsxhp") pod "87054e03-6255-407b-85c5-11b53b0c5f50" (UID: "87054e03-6255-407b-85c5-11b53b0c5f50"). InnerVolumeSpecName "kube-api-access-hsxhp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:45:46.028042 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.028031 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/87054e03-6255-407b-85c5-11b53b0c5f50-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "87054e03-6255-407b-85c5-11b53b0c5f50" (UID: "87054e03-6255-407b-85c5-11b53b0c5f50"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:45:46.126487 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.126457 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-uds\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:45:46.126487 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.126487 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:45:46.126672 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.126497 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/87054e03-6255-407b-85c5-11b53b0c5f50-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:45:46.126672 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.126508 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hsxhp\" (UniqueName: \"kubernetes.io/projected/87054e03-6255-407b-85c5-11b53b0c5f50-kube-api-access-hsxhp\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:45:46.126672 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.126518 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-tokenizer-tmp\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:45:46.126672 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.126527 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/87054e03-6255-407b-85c5-11b53b0c5f50-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:45:46.267465 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:45:46.267415 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.30:9003\" within 1s: context deadline exceeded" Apr 24 21:45:46.371117 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.371089 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt_87054e03-6255-407b-85c5-11b53b0c5f50/tokenizer/0.log" Apr 24 21:45:46.371780 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.371756 2580 generic.go:358] "Generic (PLEG): container finished" podID="87054e03-6255-407b-85c5-11b53b0c5f50" containerID="8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972" exitCode=137 Apr 24 21:45:46.371866 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.371851 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" Apr 24 21:45:46.371952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.371849 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" event={"ID":"87054e03-6255-407b-85c5-11b53b0c5f50","Type":"ContainerDied","Data":"8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972"} Apr 24 21:45:46.372028 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.371980 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt" event={"ID":"87054e03-6255-407b-85c5-11b53b0c5f50","Type":"ContainerDied","Data":"3a5ab143ae40897ce1c13c3b917cb685cb15aa647057d61a8757cba028d77f8b"} Apr 24 21:45:46.372081 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.372028 2580 scope.go:117] "RemoveContainer" containerID="8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972" Apr 24 21:45:46.383069 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.381257 2580 scope.go:117] "RemoveContainer" containerID="5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8" Apr 24 21:45:46.390790 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.390772 2580 scope.go:117] "RemoveContainer" containerID="fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818" Apr 24 21:45:46.398056 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.398032 2580 scope.go:117] "RemoveContainer" containerID="8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972" Apr 24 21:45:46.398335 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:45:46.398310 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972\": container with ID starting with 
8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972 not found: ID does not exist" containerID="8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972" Apr 24 21:45:46.398396 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.398346 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972"} err="failed to get container status \"8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972\": rpc error: code = NotFound desc = could not find container \"8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972\": container with ID starting with 8258be003c2805213bcf9bcdde564ba7e458afd7196c49142b251a319ece1972 not found: ID does not exist" Apr 24 21:45:46.398396 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.398369 2580 scope.go:117] "RemoveContainer" containerID="5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8" Apr 24 21:45:46.398600 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:45:46.398582 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8\": container with ID starting with 5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8 not found: ID does not exist" containerID="5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8" Apr 24 21:45:46.398664 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.398606 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8"} err="failed to get container status \"5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8\": rpc error: code = NotFound desc = could not find container \"5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8\": container with ID starting with 
5bdc52ff1134e6f91de1b9bdffd7a602bbd09fbffd7fb51af64a8567c650a6e8 not found: ID does not exist" Apr 24 21:45:46.398664 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.398623 2580 scope.go:117] "RemoveContainer" containerID="fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818" Apr 24 21:45:46.398888 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:45:46.398868 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818\": container with ID starting with fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818 not found: ID does not exist" containerID="fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818" Apr 24 21:45:46.398952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.398897 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818"} err="failed to get container status \"fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818\": rpc error: code = NotFound desc = could not find container \"fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818\": container with ID starting with fe0b7c4e46a493bd2ea98ad9bc2dc18446d1afff641a0d2dec5ae9c128199818 not found: ID does not exist" Apr 24 21:45:46.401511 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.401488 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt"] Apr 24 21:45:46.404428 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.404407 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-76fdcc9xs7nt"] Apr 24 21:45:46.689152 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:46.689071 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="87054e03-6255-407b-85c5-11b53b0c5f50" path="/var/lib/kubelet/pods/87054e03-6255-407b-85c5-11b53b0c5f50/volumes" Apr 24 21:45:53.023327 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:45:53.023274 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:46:03.023520 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:03.023464 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.29:8000/health\": dial tcp 10.132.0.29:8000: connect: connection refused" Apr 24 21:46:13.032859 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:13.032827 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:46:13.040574 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:13.040550 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" Apr 24 21:46:18.743290 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:18.743255 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j"] Apr 24 21:46:18.743758 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:18.743509 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main" 
containerID="cri-o://966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db" gracePeriod=30 Apr 24 21:46:28.013134 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013098 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"] Apr 24 21:46:28.013543 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013416 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="main" Apr 24 21:46:28.013543 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013427 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="main" Apr 24 21:46:28.013543 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013444 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="tokenizer" Apr 24 21:46:28.013543 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013449 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="tokenizer" Apr 24 21:46:28.013543 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013460 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="storage-initializer" Apr 24 21:46:28.013543 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013466 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="storage-initializer" Apr 24 21:46:28.013543 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013519 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="tokenizer" Apr 24 21:46:28.013543 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.013527 2580 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="87054e03-6255-407b-85c5-11b53b0c5f50" containerName="main" Apr 24 21:46:28.016424 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.016406 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.018805 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.018788 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 24 21:46:28.026078 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.026056 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"] Apr 24 21:46:28.093362 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.093323 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.093537 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.093386 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tls-certs\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.093537 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.093440 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-dshm\") pod 
\"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.093537 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.093456 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tmp-dir\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.093537 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.093480 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-model-cache\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.093537 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.093503 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-828dx\" (UniqueName: \"kubernetes.io/projected/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kube-api-access-828dx\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.093537 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.093524 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-home\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194028 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.193957 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194222 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194067 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tls-certs\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194222 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194123 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-dshm\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194222 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194145 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tmp-dir\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194222 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194176 2580 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-model-cache\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194222 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194199 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-828dx\" (UniqueName: \"kubernetes.io/projected/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kube-api-access-828dx\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194466 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194224 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-home\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194466 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194398 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194578 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194559 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-home\") pod 
\"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194628 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194592 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-model-cache\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.194698 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.194673 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tmp-dir\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.196436 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.196415 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-dshm\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.196686 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.196669 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tls-certs\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:46:28.206826 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:46:28.206800 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-828dx\" (UniqueName: \"kubernetes.io/projected/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kube-api-access-828dx\") pod \"custom-route-timeout-test-kserve-7c6c7c46cb-clzrk\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"
Apr 24 21:46:28.327529 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.327437 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"
Apr 24 21:46:28.356576 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.356539 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"]
Apr 24 21:46:28.366623 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.366591 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.369882 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.369856 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-6vvsw\""
Apr 24 21:46:28.389416 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.389385 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"]
Apr 24 21:46:28.395853 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.395826 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8zw6r\" (UniqueName: \"kubernetes.io/projected/0d4f07e8-1268-45a4-ae07-a4ffba492419-kube-api-access-8zw6r\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.396026 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.395881 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.396026 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.395953 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4f07e8-1268-45a4-ae07-a4ffba492419-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.396026 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.395982 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.396198 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.396059 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.396198 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.396094 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.468514 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.468485 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"]
Apr 24 21:46:28.470729 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:46:28.470703 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podab72925a_e8f3_441b_bcd4_b38da39f5a11.slice/crio-f5cd7d610746099902d3f55eed0725651c3f72e43d16ada7a8e6209abc72ebc6 WatchSource:0}: Error finding container f5cd7d610746099902d3f55eed0725651c3f72e43d16ada7a8e6209abc72ebc6: Status 404 returned error can't find the container with id f5cd7d610746099902d3f55eed0725651c3f72e43d16ada7a8e6209abc72ebc6
Apr 24 21:46:28.472433 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.472416 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 24 21:46:28.497242 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497220 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4f07e8-1268-45a4-ae07-a4ffba492419-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497347 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497251 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497347 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497291 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497347 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497323 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497491 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497351 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8zw6r\" (UniqueName: \"kubernetes.io/projected/0d4f07e8-1268-45a4-ae07-a4ffba492419-kube-api-access-8zw6r\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497491 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497389 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497601 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497578 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497707 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497683 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497776 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497709 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.497776 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.497763 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.499585 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.499568 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4f07e8-1268-45a4-ae07-a4ffba492419-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.504758 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.504736 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" event={"ID":"ab72925a-e8f3-441b-bcd4-b38da39f5a11","Type":"ContainerStarted","Data":"f5cd7d610746099902d3f55eed0725651c3f72e43d16ada7a8e6209abc72ebc6"}
Apr 24 21:46:28.509436 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.509417 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8zw6r\" (UniqueName: \"kubernetes.io/projected/0d4f07e8-1268-45a4-ae07-a4ffba492419-kube-api-access-8zw6r\") pod \"custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.679341 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.679301 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-6vvsw\""
Apr 24 21:46:28.687258 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.687234 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:28.829284 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:28.829219 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"]
Apr 24 21:46:28.831953 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:46:28.831924 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d4f07e8_1268_45a4_ae07_a4ffba492419.slice/crio-42e1b1c9037e12507b00195fedd711db81038f9a370dbdaff871ca1b0e9eea40 WatchSource:0}: Error finding container 42e1b1c9037e12507b00195fedd711db81038f9a370dbdaff871ca1b0e9eea40: Status 404 returned error can't find the container with id 42e1b1c9037e12507b00195fedd711db81038f9a370dbdaff871ca1b0e9eea40
Apr 24 21:46:29.510196 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:29.510147 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" event={"ID":"ab72925a-e8f3-441b-bcd4-b38da39f5a11","Type":"ContainerStarted","Data":"568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c"}
Apr 24 21:46:29.511788 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:29.511760 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" event={"ID":"0d4f07e8-1268-45a4-ae07-a4ffba492419","Type":"ContainerStarted","Data":"030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef"}
Apr 24 21:46:29.511911 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:29.511796 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" event={"ID":"0d4f07e8-1268-45a4-ae07-a4ffba492419","Type":"ContainerStarted","Data":"42e1b1c9037e12507b00195fedd711db81038f9a370dbdaff871ca1b0e9eea40"}
Apr 24 21:46:30.516275 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:30.516242 2580 generic.go:358] "Generic (PLEG): container finished" podID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerID="030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef" exitCode=0
Apr 24 21:46:30.516651 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:30.516324 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" event={"ID":"0d4f07e8-1268-45a4-ae07-a4ffba492419","Type":"ContainerDied","Data":"030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef"}
Apr 24 21:46:31.526930 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:31.526896 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" event={"ID":"0d4f07e8-1268-45a4-ae07-a4ffba492419","Type":"ContainerStarted","Data":"6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69"}
Apr 24 21:46:31.526930 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:31.526930 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" event={"ID":"0d4f07e8-1268-45a4-ae07-a4ffba492419","Type":"ContainerStarted","Data":"4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c"}
Apr 24 21:46:31.527349 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:31.527141 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:31.550581 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:31.550532 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" podStartSLOduration=3.550516312 podStartE2EDuration="3.550516312s" podCreationTimestamp="2026-04-24 21:46:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:46:31.548362761 +0000 UTC m=+1143.346425749" watchObservedRunningTime="2026-04-24 21:46:31.550516312 +0000 UTC m=+1143.348579281"
Apr 24 21:46:38.687987 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:38.687951 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:38.688361 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:38.688018 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:38.690150 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:38.690122 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:39.554764 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:39.554734 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:46:49.016773 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.016752 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j_83bed8e4-5185-4c49-943a-0b47d64de7c9/main/0.log"
Apr 24 21:46:49.017183 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.017166 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j"
Apr 24 21:46:49.080839 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.080811 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-dshm\") pod \"83bed8e4-5185-4c49-943a-0b47d64de7c9\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") "
Apr 24 21:46:49.080967 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.080855 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-tmp-dir\") pod \"83bed8e4-5185-4c49-943a-0b47d64de7c9\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") "
Apr 24 21:46:49.080967 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.080878 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5jdng\" (UniqueName: \"kubernetes.io/projected/83bed8e4-5185-4c49-943a-0b47d64de7c9-kube-api-access-5jdng\") pod \"83bed8e4-5185-4c49-943a-0b47d64de7c9\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") "
Apr 24 21:46:49.080967 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.080930 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-kserve-provision-location\") pod \"83bed8e4-5185-4c49-943a-0b47d64de7c9\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") "
Apr 24 21:46:49.080967 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.080953 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-model-cache\") pod \"83bed8e4-5185-4c49-943a-0b47d64de7c9\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") "
Apr 24 21:46:49.081212 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.080985 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83bed8e4-5185-4c49-943a-0b47d64de7c9-tls-certs\") pod \"83bed8e4-5185-4c49-943a-0b47d64de7c9\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") "
Apr 24 21:46:49.081212 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.081077 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-home\") pod \"83bed8e4-5185-4c49-943a-0b47d64de7c9\" (UID: \"83bed8e4-5185-4c49-943a-0b47d64de7c9\") "
Apr 24 21:46:49.081318 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.081221 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-model-cache" (OuterVolumeSpecName: "model-cache") pod "83bed8e4-5185-4c49-943a-0b47d64de7c9" (UID: "83bed8e4-5185-4c49-943a-0b47d64de7c9"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:46:49.081376 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.081348 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:46:49.081868 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.081844 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-home" (OuterVolumeSpecName: "home") pod "83bed8e4-5185-4c49-943a-0b47d64de7c9" (UID: "83bed8e4-5185-4c49-943a-0b47d64de7c9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:46:49.083227 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.083199 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/83bed8e4-5185-4c49-943a-0b47d64de7c9-kube-api-access-5jdng" (OuterVolumeSpecName: "kube-api-access-5jdng") pod "83bed8e4-5185-4c49-943a-0b47d64de7c9" (UID: "83bed8e4-5185-4c49-943a-0b47d64de7c9"). InnerVolumeSpecName "kube-api-access-5jdng". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:46:49.083430 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.083413 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-dshm" (OuterVolumeSpecName: "dshm") pod "83bed8e4-5185-4c49-943a-0b47d64de7c9" (UID: "83bed8e4-5185-4c49-943a-0b47d64de7c9"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:46:49.083577 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.083554 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/83bed8e4-5185-4c49-943a-0b47d64de7c9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "83bed8e4-5185-4c49-943a-0b47d64de7c9" (UID: "83bed8e4-5185-4c49-943a-0b47d64de7c9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:46:49.093524 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.093500 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "83bed8e4-5185-4c49-943a-0b47d64de7c9" (UID: "83bed8e4-5185-4c49-943a-0b47d64de7c9"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:46:49.136886 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.136848 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "83bed8e4-5185-4c49-943a-0b47d64de7c9" (UID: "83bed8e4-5185-4c49-943a-0b47d64de7c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:46:49.181742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.181708 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:46:49.181742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.181739 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5jdng\" (UniqueName: \"kubernetes.io/projected/83bed8e4-5185-4c49-943a-0b47d64de7c9-kube-api-access-5jdng\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:46:49.181902 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.181753 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:46:49.181902 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.181762 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/83bed8e4-5185-4c49-943a-0b47d64de7c9-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:46:49.181902 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.181771 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:46:49.181902 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.181780 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/83bed8e4-5185-4c49-943a-0b47d64de7c9-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:46:49.588039 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.587986 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j_83bed8e4-5185-4c49-943a-0b47d64de7c9/main/0.log"
Apr 24 21:46:49.588360 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.588338 2580 generic.go:358] "Generic (PLEG): container finished" podID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerID="966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db" exitCode=137
Apr 24 21:46:49.588444 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.588426 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j"
Apr 24 21:46:49.588444 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.588426 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" event={"ID":"83bed8e4-5185-4c49-943a-0b47d64de7c9","Type":"ContainerDied","Data":"966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db"}
Apr 24 21:46:49.588525 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.588473 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j" event={"ID":"83bed8e4-5185-4c49-943a-0b47d64de7c9","Type":"ContainerDied","Data":"f37749baf09199cf6183a34cc87246e68f438dbf2b96cbedf39193cc77d24f5b"}
Apr 24 21:46:49.588525 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.588494 2580 scope.go:117] "RemoveContainer" containerID="966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db"
Apr 24 21:46:49.597225 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.597207 2580 scope.go:117] "RemoveContainer" containerID="ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7"
Apr 24 21:46:49.611024 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.611010 2580 scope.go:117] "RemoveContainer" containerID="966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db"
Apr 24 21:46:49.611312 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:46:49.611284 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db\": container with ID starting with 966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db not found: ID does not exist" containerID="966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db"
Apr 24 21:46:49.611423 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.611331 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db"} err="failed to get container status \"966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db\": rpc error: code = NotFound desc = could not find container \"966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db\": container with ID starting with 966c38eb597b1acc4d69f5e1d76f4bb64ccf203a951298387b909b9b1e5688db not found: ID does not exist"
Apr 24 21:46:49.611423 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.611354 2580 scope.go:117] "RemoveContainer" containerID="ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7"
Apr 24 21:46:49.611664 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:46:49.611632 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7\": container with ID starting with ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7 not found: ID does not exist" containerID="ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7"
Apr 24 21:46:49.611738 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.611673 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7"} err="failed to get container status \"ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7\": rpc error: code = NotFound desc = could not find container \"ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7\": container with ID starting with ec14bb17dfb82ae46ae7d89c1c51623c84976de9b5cf8a86993f5083894af8d7 not found: ID does not exist"
Apr 24 21:46:49.612512 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.612492 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j"]
Apr 24 21:46:49.617789 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:49.617766 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-59d97cbbd8zvq5j"]
Apr 24 21:46:50.688237 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:46:50.688197 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" path="/var/lib/kubelet/pods/83bed8e4-5185-4c49-943a-0b47d64de7c9/volumes"
Apr 24 21:47:00.558579 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:00.558551 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"
Apr 24 21:47:28.678268 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:28.678237 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 21:47:28.678895 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:28.678876 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 21:47:28.684102 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:28.684076 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 21:47:28.690025 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:28.689963 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 21:47:46.764774 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:46.764735 2580 generic.go:358] "Generic (PLEG): container finished" podID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerID="568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c" exitCode=0
Apr 24 21:47:46.765230 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:46.764812 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" event={"ID":"ab72925a-e8f3-441b-bcd4-b38da39f5a11","Type":"ContainerDied","Data":"568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c"}
Apr 24 21:47:47.770056 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:47.770010 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" event={"ID":"ab72925a-e8f3-441b-bcd4-b38da39f5a11","Type":"ContainerStarted","Data":"6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d"}
Apr 24 21:47:47.791816 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:47.791752 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podStartSLOduration=80.791730914 podStartE2EDuration="1m20.791730914s" podCreationTimestamp="2026-04-24 21:46:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:47:47.791307592 +0000 UTC m=+1219.589370563" watchObservedRunningTime="2026-04-24 21:47:47.791730914 +0000 UTC m=+1219.589793885"
Apr 24 21:47:48.327832 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:48.327790 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"
Apr 24 21:47:48.328042 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:48.327848 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"
Apr 24 21:47:48.329569 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:48.329535 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused"
Apr 24 21:47:58.328866 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:47:58.328807 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused"
Apr 24 21:48:08.328643 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.328591 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused"
Apr 24 21:48:08.823867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.823831 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"]
Apr 24 21:48:08.824206 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.824191 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main"
Apr 24 21:48:08.824270 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.824208 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main"
Apr 24 21:48:08.824270 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.824223 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="storage-initializer"
Apr 24 21:48:08.824270 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.824233 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="storage-initializer"
Apr 24 21:48:08.824360 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.824296 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="83bed8e4-5185-4c49-943a-0b47d64de7c9" containerName="main"
Apr 24 21:48:08.829014 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.828982 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"
Apr 24 21:48:08.831359 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.831339 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-lora-crit-kserve-self-signed-certs\""
Apr 24 21:48:08.839988 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.839965 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"]
Apr 24 21:48:08.848676 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.848643 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxzdb\" (UniqueName: \"kubernetes.io/projected/ff901051-be5e-4601-9795-026c725b8317-kube-api-access-hxzdb\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"
Apr 24 21:48:08.848805 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.848694 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff901051-be5e-4601-9795-026c725b8317-tls-certs\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"
Apr 24 21:48:08.848805 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.848735 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"
Apr 24 21:48:08.848924 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.848834 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-home\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"
Apr 24 21:48:08.848924 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.848863 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-dshm\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"
Apr 24 21:48:08.848924 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.848885 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-tmp-dir\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"
Apr 24 21:48:08.849111 ip-10-0-133-36 kubenswrapper[2580]:
I0424 21:48:08.848926 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-model-cache\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.949715 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.949681 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.949891 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.949744 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-home\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.949891 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.949849 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-dshm\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.949891 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.949881 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-tmp-dir\") pod 
\"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.950075 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.949918 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-model-cache\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.950075 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.949989 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxzdb\" (UniqueName: \"kubernetes.io/projected/ff901051-be5e-4601-9795-026c725b8317-kube-api-access-hxzdb\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.950075 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.950054 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff901051-be5e-4601-9795-026c725b8317-tls-certs\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.950227 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.950173 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-home\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.950294 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.950265 
2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.950436 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.950307 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-tmp-dir\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.950577 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.950555 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-model-cache\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.952145 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.952118 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-dshm\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.952451 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.952432 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff901051-be5e-4601-9795-026c725b8317-tls-certs\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: 
\"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:08.958478 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:08.958460 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxzdb\" (UniqueName: \"kubernetes.io/projected/ff901051-be5e-4601-9795-026c725b8317-kube-api-access-hxzdb\") pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:09.140048 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:09.140012 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:09.276700 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:09.276636 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"] Apr 24 21:48:09.279535 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:48:09.279504 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podff901051_be5e_4601_9795_026c725b8317.slice/crio-db5bad269865a1da8cf9563e63aa3844a4842bec910fc78a7cdc9e5f70fa42ff WatchSource:0}: Error finding container db5bad269865a1da8cf9563e63aa3844a4842bec910fc78a7cdc9e5f70fa42ff: Status 404 returned error can't find the container with id db5bad269865a1da8cf9563e63aa3844a4842bec910fc78a7cdc9e5f70fa42ff Apr 24 21:48:09.847366 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:09.847340 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-568c94495b-pmqkp_ff901051-be5e-4601-9795-026c725b8317/storage-initializer/0.log" Apr 24 21:48:09.847728 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:09.847382 2580 generic.go:358] "Generic (PLEG): container finished" 
podID="ff901051-be5e-4601-9795-026c725b8317" containerID="32d0e3e100f82c93ce539dc4fb52866fb2e1f0f6ad9fd4b329a85b5905c9c72e" exitCode=1 Apr 24 21:48:09.847728 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:09.847463 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" event={"ID":"ff901051-be5e-4601-9795-026c725b8317","Type":"ContainerDied","Data":"32d0e3e100f82c93ce539dc4fb52866fb2e1f0f6ad9fd4b329a85b5905c9c72e"} Apr 24 21:48:09.847728 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:09.847497 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" event={"ID":"ff901051-be5e-4601-9795-026c725b8317","Type":"ContainerStarted","Data":"db5bad269865a1da8cf9563e63aa3844a4842bec910fc78a7cdc9e5f70fa42ff"} Apr 24 21:48:10.851692 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:10.851606 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-568c94495b-pmqkp_ff901051-be5e-4601-9795-026c725b8317/storage-initializer/1.log" Apr 24 21:48:10.852097 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:10.852076 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-568c94495b-pmqkp_ff901051-be5e-4601-9795-026c725b8317/storage-initializer/0.log" Apr 24 21:48:10.852164 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:10.852123 2580 generic.go:358] "Generic (PLEG): container finished" podID="ff901051-be5e-4601-9795-026c725b8317" containerID="adf8189afd06f67849d53553fb8876ffee525985a93c094103c6f2e65982dd48" exitCode=1 Apr 24 21:48:10.852235 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:10.852211 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" 
event={"ID":"ff901051-be5e-4601-9795-026c725b8317","Type":"ContainerDied","Data":"adf8189afd06f67849d53553fb8876ffee525985a93c094103c6f2e65982dd48"} Apr 24 21:48:10.852289 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:10.852258 2580 scope.go:117] "RemoveContainer" containerID="32d0e3e100f82c93ce539dc4fb52866fb2e1f0f6ad9fd4b329a85b5905c9c72e" Apr 24 21:48:10.852557 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:10.852539 2580 scope.go:117] "RemoveContainer" containerID="32d0e3e100f82c93ce539dc4fb52866fb2e1f0f6ad9fd4b329a85b5905c9c72e" Apr 24 21:48:10.864194 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:48:10.864165 2580 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-568c94495b-pmqkp_kserve-ci-e2e-test_ff901051-be5e-4601-9795-026c725b8317_0 in pod sandbox db5bad269865a1da8cf9563e63aa3844a4842bec910fc78a7cdc9e5f70fa42ff from index: no such id: '32d0e3e100f82c93ce539dc4fb52866fb2e1f0f6ad9fd4b329a85b5905c9c72e'" containerID="32d0e3e100f82c93ce539dc4fb52866fb2e1f0f6ad9fd4b329a85b5905c9c72e" Apr 24 21:48:10.864288 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:48:10.864217 2580 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-568c94495b-pmqkp_kserve-ci-e2e-test_ff901051-be5e-4601-9795-026c725b8317_0 in pod sandbox db5bad269865a1da8cf9563e63aa3844a4842bec910fc78a7cdc9e5f70fa42ff from index: no such id: '32d0e3e100f82c93ce539dc4fb52866fb2e1f0f6ad9fd4b329a85b5905c9c72e'; Skipping pod \"conv-test-lora-crit-kserve-568c94495b-pmqkp_kserve-ci-e2e-test(ff901051-be5e-4601-9795-026c725b8317)\"" logger="UnhandledError" Apr 24 21:48:10.865521 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:48:10.865494 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" 
for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-568c94495b-pmqkp_kserve-ci-e2e-test(ff901051-be5e-4601-9795-026c725b8317)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" podUID="ff901051-be5e-4601-9795-026c725b8317" Apr 24 21:48:11.856550 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:11.856523 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-568c94495b-pmqkp_ff901051-be5e-4601-9795-026c725b8317/storage-initializer/1.log" Apr 24 21:48:11.857086 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:48:11.857067 2580 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-568c94495b-pmqkp_kserve-ci-e2e-test(ff901051-be5e-4601-9795-026c725b8317)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" podUID="ff901051-be5e-4601-9795-026c725b8317" Apr 24 21:48:17.411724 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.411647 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp"] Apr 24 21:48:17.417066 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.417040 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.422328 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.422303 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 24 21:48:17.459783 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.459758 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp"] Apr 24 21:48:17.516093 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.516060 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-home\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.516093 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.516099 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7595a027-c4cd-4e80-adfc-5b9512847709-tls-certs\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.516296 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.516116 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-dshm\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.516296 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.516173 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-tmp-dir\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.516296 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.516225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-model-cache\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.516296 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.516244 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-kserve-provision-location\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.516296 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.516259 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rdfmq\" (UniqueName: \"kubernetes.io/projected/7595a027-c4cd-4e80-adfc-5b9512847709-kube-api-access-rdfmq\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617375 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617322 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-model-cache\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617375 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617384 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-kserve-provision-location\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617634 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617413 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rdfmq\" (UniqueName: \"kubernetes.io/projected/7595a027-c4cd-4e80-adfc-5b9512847709-kube-api-access-rdfmq\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617634 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617567 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-home\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617634 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617618 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7595a027-c4cd-4e80-adfc-5b9512847709-tls-certs\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617786 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617648 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-dshm\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617786 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617702 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-tmp-dir\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617966 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617828 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-model-cache\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617966 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617882 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-kserve-provision-location\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.617966 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.617906 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-home\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.618154 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.618115 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-tmp-dir\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.619963 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.619935 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-dshm\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.620346 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.620327 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7595a027-c4cd-4e80-adfc-5b9512847709-tls-certs\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.627753 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:17.627731 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rdfmq\" (UniqueName: \"kubernetes.io/projected/7595a027-c4cd-4e80-adfc-5b9512847709-kube-api-access-rdfmq\") pod \"stop-feature-test-kserve-8446b959c6-fxpwp\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:17.727497 ip-10-0-133-36 kubenswrapper[2580]: 
I0424 21:48:17.727420 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:18.067173 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:18.067138 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp"] Apr 24 21:48:18.069053 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:48:18.069022 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7595a027_c4cd_4e80_adfc_5b9512847709.slice/crio-6b422844f551823e2240d40b0fbf0be22b9dd4a6331b6e648a443b986cb92f1b WatchSource:0}: Error finding container 6b422844f551823e2240d40b0fbf0be22b9dd4a6331b6e648a443b986cb92f1b: Status 404 returned error can't find the container with id 6b422844f551823e2240d40b0fbf0be22b9dd4a6331b6e648a443b986cb92f1b Apr 24 21:48:18.328211 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:18.328107 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:48:18.882636 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:18.882598 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" event={"ID":"7595a027-c4cd-4e80-adfc-5b9512847709","Type":"ContainerStarted","Data":"fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4"} Apr 24 21:48:18.882636 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:18.882640 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" 
event={"ID":"7595a027-c4cd-4e80-adfc-5b9512847709","Type":"ContainerStarted","Data":"6b422844f551823e2240d40b0fbf0be22b9dd4a6331b6e648a443b986cb92f1b"} Apr 24 21:48:20.586251 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.586202 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"] Apr 24 21:48:20.740461 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.740419 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-568c94495b-pmqkp_ff901051-be5e-4601-9795-026c725b8317/storage-initializer/1.log" Apr 24 21:48:20.740619 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.740485 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:20.744165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744141 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff901051-be5e-4601-9795-026c725b8317-tls-certs\") pod \"ff901051-be5e-4601-9795-026c725b8317\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " Apr 24 21:48:20.744302 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744171 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-home\") pod \"ff901051-be5e-4601-9795-026c725b8317\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " Apr 24 21:48:20.744302 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744190 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-model-cache\") pod \"ff901051-be5e-4601-9795-026c725b8317\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " Apr 24 21:48:20.744302 ip-10-0-133-36 kubenswrapper[2580]: 
I0424 21:48:20.744222 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-dshm\") pod \"ff901051-be5e-4601-9795-026c725b8317\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " Apr 24 21:48:20.744302 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744256 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-hxzdb\" (UniqueName: \"kubernetes.io/projected/ff901051-be5e-4601-9795-026c725b8317-kube-api-access-hxzdb\") pod \"ff901051-be5e-4601-9795-026c725b8317\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " Apr 24 21:48:20.744484 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744306 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-tmp-dir\") pod \"ff901051-be5e-4601-9795-026c725b8317\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " Apr 24 21:48:20.744484 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744384 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-kserve-provision-location\") pod \"ff901051-be5e-4601-9795-026c725b8317\" (UID: \"ff901051-be5e-4601-9795-026c725b8317\") " Apr 24 21:48:20.744846 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744620 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-model-cache" (OuterVolumeSpecName: "model-cache") pod "ff901051-be5e-4601-9795-026c725b8317" (UID: "ff901051-be5e-4601-9795-026c725b8317"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:20.744846 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744642 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-home" (OuterVolumeSpecName: "home") pod "ff901051-be5e-4601-9795-026c725b8317" (UID: "ff901051-be5e-4601-9795-026c725b8317"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:20.744846 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.744813 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "ff901051-be5e-4601-9795-026c725b8317" (UID: "ff901051-be5e-4601-9795-026c725b8317"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:20.745320 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.745292 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ff901051-be5e-4601-9795-026c725b8317" (UID: "ff901051-be5e-4601-9795-026c725b8317"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:20.746813 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.746782 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ff901051-be5e-4601-9795-026c725b8317-kube-api-access-hxzdb" (OuterVolumeSpecName: "kube-api-access-hxzdb") pod "ff901051-be5e-4601-9795-026c725b8317" (UID: "ff901051-be5e-4601-9795-026c725b8317"). InnerVolumeSpecName "kube-api-access-hxzdb". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:48:20.747678 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.747654 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-dshm" (OuterVolumeSpecName: "dshm") pod "ff901051-be5e-4601-9795-026c725b8317" (UID: "ff901051-be5e-4601-9795-026c725b8317"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:48:20.747770 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.747670 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ff901051-be5e-4601-9795-026c725b8317-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ff901051-be5e-4601-9795-026c725b8317" (UID: "ff901051-be5e-4601-9795-026c725b8317"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:48:20.846119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.846036 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:48:20.846119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.846069 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-hxzdb\" (UniqueName: \"kubernetes.io/projected/ff901051-be5e-4601-9795-026c725b8317-kube-api-access-hxzdb\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:48:20.846119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.846080 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:48:20.846119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.846091 2580 reconciler_common.go:299] "Volume detached for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:48:20.846119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.846101 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ff901051-be5e-4601-9795-026c725b8317-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:48:20.846119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.846110 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:48:20.846119 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.846118 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ff901051-be5e-4601-9795-026c725b8317-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:48:20.890926 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.890885 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-568c94495b-pmqkp_ff901051-be5e-4601-9795-026c725b8317/storage-initializer/1.log" Apr 24 21:48:20.891135 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.890961 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" event={"ID":"ff901051-be5e-4601-9795-026c725b8317","Type":"ContainerDied","Data":"db5bad269865a1da8cf9563e63aa3844a4842bec910fc78a7cdc9e5f70fa42ff"} Apr 24 21:48:20.891135 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.891005 2580 scope.go:117] "RemoveContainer" containerID="adf8189afd06f67849d53553fb8876ffee525985a93c094103c6f2e65982dd48" Apr 24 21:48:20.891135 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.891027 2580 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp" Apr 24 21:48:20.944356 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.944316 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"] Apr 24 21:48:20.951143 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:20.951114 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-568c94495b-pmqkp"] Apr 24 21:48:22.687858 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:22.687825 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ff901051-be5e-4601-9795-026c725b8317" path="/var/lib/kubelet/pods/ff901051-be5e-4601-9795-026c725b8317/volumes" Apr 24 21:48:22.900412 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:22.900373 2580 generic.go:358] "Generic (PLEG): container finished" podID="7595a027-c4cd-4e80-adfc-5b9512847709" containerID="fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4" exitCode=0 Apr 24 21:48:22.900609 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:22.900446 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" event={"ID":"7595a027-c4cd-4e80-adfc-5b9512847709","Type":"ContainerDied","Data":"fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4"} Apr 24 21:48:23.906014 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:23.905959 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" event={"ID":"7595a027-c4cd-4e80-adfc-5b9512847709","Type":"ContainerStarted","Data":"9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3"} Apr 24 21:48:23.930045 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:23.929954 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podStartSLOduration=6.929937476 podStartE2EDuration="6.929937476s" podCreationTimestamp="2026-04-24 21:48:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:48:23.928576108 +0000 UTC m=+1255.726639115" watchObservedRunningTime="2026-04-24 21:48:23.929937476 +0000 UTC m=+1255.728000459" Apr 24 21:48:27.728394 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:27.728350 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:27.728394 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:27.728409 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:48:27.731066 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:27.729962 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:28.328222 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:28.328173 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:48:37.728252 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:37.728200 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" 
output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:38.328880 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:38.328825 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:48:47.728906 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:47.728849 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:48.328032 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:48.327963 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:48:57.728881 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:57.728816 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:48:58.328580 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:48:58.328529 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:49:07.728773 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:07.728718 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:08.328419 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:08.328378 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:49:17.728719 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:17.728666 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:18.328121 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:18.328068 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" probeResult="failure" output="Get \"https://10.132.0.31:8000/health\": dial tcp 10.132.0.31:8000: connect: connection refused" Apr 24 21:49:27.728721 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:27.728672 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" 
containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:28.338295 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:28.338257 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:49:28.347069 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:28.347028 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:49:37.728612 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:37.728565 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:37.803398 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:37.803353 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"] Apr 24 21:49:37.803769 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:37.803736 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="main" containerID="cri-o://4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c" gracePeriod=30 Apr 24 21:49:37.803917 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:37.803781 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="tokenizer" 
containerID="cri-o://6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69" gracePeriod=30 Apr 24 21:49:37.805607 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:37.805581 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"] Apr 24 21:49:37.806228 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:37.806138 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main" containerID="cri-o://6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d" gracePeriod=30 Apr 24 21:49:38.179750 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:38.179711 2580 generic.go:358] "Generic (PLEG): container finished" podID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerID="4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c" exitCode=0 Apr 24 21:49:38.179933 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:38.179780 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" event={"ID":"0d4f07e8-1268-45a4-ae07-a4ffba492419","Type":"ContainerDied","Data":"4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c"} Apr 24 21:49:39.055142 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.055120 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" Apr 24 21:49:39.154495 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154467 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8zw6r\" (UniqueName: \"kubernetes.io/projected/0d4f07e8-1268-45a4-ae07-a4ffba492419-kube-api-access-8zw6r\") pod \"0d4f07e8-1268-45a4-ae07-a4ffba492419\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " Apr 24 21:49:39.154669 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154562 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-cache\") pod \"0d4f07e8-1268-45a4-ae07-a4ffba492419\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " Apr 24 21:49:39.154669 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154603 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-uds\") pod \"0d4f07e8-1268-45a4-ae07-a4ffba492419\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " Apr 24 21:49:39.154669 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154637 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-tmp\") pod \"0d4f07e8-1268-45a4-ae07-a4ffba492419\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " Apr 24 21:49:39.154669 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154664 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4f07e8-1268-45a4-ae07-a4ffba492419-tls-certs\") pod \"0d4f07e8-1268-45a4-ae07-a4ffba492419\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " Apr 24 21:49:39.154866 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154704 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-kserve-provision-location\") pod \"0d4f07e8-1268-45a4-ae07-a4ffba492419\" (UID: \"0d4f07e8-1268-45a4-ae07-a4ffba492419\") " Apr 24 21:49:39.154915 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154865 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "0d4f07e8-1268-45a4-ae07-a4ffba492419" (UID: "0d4f07e8-1268-45a4-ae07-a4ffba492419"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:39.154915 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154876 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "0d4f07e8-1268-45a4-ae07-a4ffba492419" (UID: "0d4f07e8-1268-45a4-ae07-a4ffba492419"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:39.155056 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.154960 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "0d4f07e8-1268-45a4-ae07-a4ffba492419" (UID: "0d4f07e8-1268-45a4-ae07-a4ffba492419"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:39.155056 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.155023 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:49:39.155056 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.155041 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-uds\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:49:39.155576 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.155553 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "0d4f07e8-1268-45a4-ae07-a4ffba492419" (UID: "0d4f07e8-1268-45a4-ae07-a4ffba492419"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:49:39.156816 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.156784 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0d4f07e8-1268-45a4-ae07-a4ffba492419-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "0d4f07e8-1268-45a4-ae07-a4ffba492419" (UID: "0d4f07e8-1268-45a4-ae07-a4ffba492419"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:49:39.156872 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.156837 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0d4f07e8-1268-45a4-ae07-a4ffba492419-kube-api-access-8zw6r" (OuterVolumeSpecName: "kube-api-access-8zw6r") pod "0d4f07e8-1268-45a4-ae07-a4ffba492419" (UID: "0d4f07e8-1268-45a4-ae07-a4ffba492419"). 
InnerVolumeSpecName "kube-api-access-8zw6r". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:49:39.184225 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.184199 2580 generic.go:358] "Generic (PLEG): container finished" podID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerID="6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69" exitCode=0 Apr 24 21:49:39.184356 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.184286 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" event={"ID":"0d4f07e8-1268-45a4-ae07-a4ffba492419","Type":"ContainerDied","Data":"6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69"} Apr 24 21:49:39.184356 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.184307 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" Apr 24 21:49:39.184356 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.184329 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2" event={"ID":"0d4f07e8-1268-45a4-ae07-a4ffba492419","Type":"ContainerDied","Data":"42e1b1c9037e12507b00195fedd711db81038f9a370dbdaff871ca1b0e9eea40"} Apr 24 21:49:39.184356 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.184350 2580 scope.go:117] "RemoveContainer" containerID="6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69" Apr 24 21:49:39.191877 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.191860 2580 scope.go:117] "RemoveContainer" containerID="4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c" Apr 24 21:49:39.199268 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.199253 2580 scope.go:117] "RemoveContainer" containerID="030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef" Apr 24 21:49:39.206749 
ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.206727 2580 scope.go:117] "RemoveContainer" containerID="6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69" Apr 24 21:49:39.206976 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:49:39.206961 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69\": container with ID starting with 6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69 not found: ID does not exist" containerID="6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69" Apr 24 21:49:39.207047 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.206983 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69"} err="failed to get container status \"6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69\": rpc error: code = NotFound desc = could not find container \"6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69\": container with ID starting with 6ca9c414dfe83172c66a209a82b1f157ac27743c10baeb8488f9d42494f8ee69 not found: ID does not exist" Apr 24 21:49:39.207047 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.207015 2580 scope.go:117] "RemoveContainer" containerID="4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c" Apr 24 21:49:39.207242 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:49:39.207227 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c\": container with ID starting with 4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c not found: ID does not exist" containerID="4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c" Apr 24 21:49:39.207286 ip-10-0-133-36 
kubenswrapper[2580]: I0424 21:49:39.207245 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c"} err="failed to get container status \"4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c\": rpc error: code = NotFound desc = could not find container \"4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c\": container with ID starting with 4cfec5e638cefc90d8d7d128a9ac00cc383a4249cab0ee8f39972ee18e674e1c not found: ID does not exist" Apr 24 21:49:39.207286 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.207258 2580 scope.go:117] "RemoveContainer" containerID="030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef" Apr 24 21:49:39.207504 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:49:39.207484 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef\": container with ID starting with 030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef not found: ID does not exist" containerID="030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef" Apr 24 21:49:39.207549 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.207512 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef"} err="failed to get container status \"030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef\": rpc error: code = NotFound desc = could not find container \"030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef\": container with ID starting with 030494ad27b7b9e5081e3d90f94cf67fc96c95f185badcbd310968e591f739ef not found: ID does not exist" Apr 24 21:49:39.215884 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.215863 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"] Apr 24 21:49:39.223553 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.223533 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-657d5b7c4x8h2"] Apr 24 21:49:39.256227 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.256205 2580 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-tokenizer-tmp\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:49:39.256227 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.256225 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/0d4f07e8-1268-45a4-ae07-a4ffba492419-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:49:39.256359 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.256235 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/0d4f07e8-1268-45a4-ae07-a4ffba492419-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:49:39.256359 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:39.256245 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8zw6r\" (UniqueName: \"kubernetes.io/projected/0d4f07e8-1268-45a4-ae07-a4ffba492419-kube-api-access-8zw6r\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:49:40.688616 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:40.688582 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" path="/var/lib/kubelet/pods/0d4f07e8-1268-45a4-ae07-a4ffba492419/volumes" Apr 24 21:49:47.728612 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:47.728523 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:49:50.812913 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.812886 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"] Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813215 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="main" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813227 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="main" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813237 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff901051-be5e-4601-9795-026c725b8317" containerName="storage-initializer" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813243 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff901051-be5e-4601-9795-026c725b8317" containerName="storage-initializer" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813263 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="tokenizer" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813268 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="tokenizer" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813277 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" 
containerName="storage-initializer" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813287 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="storage-initializer" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813300 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ff901051-be5e-4601-9795-026c725b8317" containerName="storage-initializer" Apr 24 21:49:50.813305 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813305 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ff901051-be5e-4601-9795-026c725b8317" containerName="storage-initializer" Apr 24 21:49:50.813638 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813351 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff901051-be5e-4601-9795-026c725b8317" containerName="storage-initializer" Apr 24 21:49:50.813638 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813364 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="main" Apr 24 21:49:50.813638 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813371 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="0d4f07e8-1268-45a4-ae07-a4ffba492419" containerName="tokenizer" Apr 24 21:49:50.813638 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.813473 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ff901051-be5e-4601-9795-026c725b8317" containerName="storage-initializer" Apr 24 21:49:50.817596 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.817579 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.820734 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.820715 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 24 21:49:50.829270 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.829250 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"] Apr 24 21:49:50.847839 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.847748 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-tmp-dir\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.847839 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.847827 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/742a75c5-0802-4047-8b47-9600e7751e1e-tls-certs\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.848127 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.847857 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-model-cache\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.848127 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.847881 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-kserve-provision-location\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.848127 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.847930 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-home\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.848127 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.847964 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qq97b\" (UniqueName: \"kubernetes.io/projected/742a75c5-0802-4047-8b47-9600e7751e1e-kube-api-access-qq97b\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.848127 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.848018 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-dshm\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.949510 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.949475 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-home\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.949654 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.949516 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qq97b\" (UniqueName: \"kubernetes.io/projected/742a75c5-0802-4047-8b47-9600e7751e1e-kube-api-access-qq97b\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.949654 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.949560 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-dshm\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.949654 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.949624 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-tmp-dir\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.949811 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.949677 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/742a75c5-0802-4047-8b47-9600e7751e1e-tls-certs\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.949811 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.949703 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-model-cache\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.949811 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.949792 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-kserve-provision-location\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.949947 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.949864 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-home\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.950102 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.950065 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-model-cache\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.950220 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.950121 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-tmp-dir\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.950307 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.950286 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-kserve-provision-location\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.951841 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.951822 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-dshm\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.952206 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.952186 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/742a75c5-0802-4047-8b47-9600e7751e1e-tls-certs\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:50.958185 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:50.958167 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qq97b\" (UniqueName: \"kubernetes.io/projected/742a75c5-0802-4047-8b47-9600e7751e1e-kube-api-access-qq97b\") pod \"router-with-refs-test-kserve-5d74d49756-cw8kg\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:51.128980 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:51.128947 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" Apr 24 21:49:51.261205 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:51.261180 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"] Apr 24 21:49:51.263803 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:49:51.263775 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod742a75c5_0802_4047_8b47_9600e7751e1e.slice/crio-8c087b38dc380babe0eb98e8c02dc59128ccb54769a0009e2a1019dd55636f29 WatchSource:0}: Error finding container 8c087b38dc380babe0eb98e8c02dc59128ccb54769a0009e2a1019dd55636f29: Status 404 returned error can't find the container with id 8c087b38dc380babe0eb98e8c02dc59128ccb54769a0009e2a1019dd55636f29 Apr 24 21:49:52.230905 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:52.230870 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" event={"ID":"742a75c5-0802-4047-8b47-9600e7751e1e","Type":"ContainerStarted","Data":"76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017"} Apr 24 21:49:52.230905 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:52.230909 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" event={"ID":"742a75c5-0802-4047-8b47-9600e7751e1e","Type":"ContainerStarted","Data":"8c087b38dc380babe0eb98e8c02dc59128ccb54769a0009e2a1019dd55636f29"} Apr 24 21:49:57.728346 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:49:57.728303 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" 
podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" probeResult="failure" output="Get \"https://10.132.0.34:8000/health\": dial tcp 10.132.0.34:8000: connect: connection refused" Apr 24 21:50:04.274603 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:04.274568 2580 generic.go:358] "Generic (PLEG): container finished" podID="742a75c5-0802-4047-8b47-9600e7751e1e" containerID="76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017" exitCode=0 Apr 24 21:50:04.275034 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:04.274646 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" event={"ID":"742a75c5-0802-4047-8b47-9600e7751e1e","Type":"ContainerDied","Data":"76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017"} Apr 24 21:50:05.282617 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:05.282583 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" event={"ID":"742a75c5-0802-4047-8b47-9600e7751e1e","Type":"ContainerStarted","Data":"46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7"} Apr 24 21:50:05.306890 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:05.306834 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podStartSLOduration=15.306818296 podStartE2EDuration="15.306818296s" podCreationTimestamp="2026-04-24 21:49:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:05.305058928 +0000 UTC m=+1357.103121901" watchObservedRunningTime="2026-04-24 21:50:05.306818296 +0000 UTC m=+1357.104881266" Apr 24 21:50:07.738316 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:07.738278 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:50:07.746598 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:07.746566 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" Apr 24 21:50:08.050502 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.050480 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-7c6c7c46cb-clzrk_ab72925a-e8f3-441b-bcd4-b38da39f5a11/main/0.log" Apr 24 21:50:08.050879 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.050860 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:50:08.100529 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.100497 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kserve-provision-location\") pod \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " Apr 24 21:50:08.100723 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.100540 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-828dx\" (UniqueName: \"kubernetes.io/projected/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kube-api-access-828dx\") pod \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " Apr 24 21:50:08.100723 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.100589 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tls-certs\") pod \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " Apr 24 21:50:08.100723 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:50:08.100634 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-dshm\") pod \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " Apr 24 21:50:08.100723 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.100675 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-model-cache\") pod \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " Apr 24 21:50:08.100915 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.100870 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tmp-dir\") pod \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " Apr 24 21:50:08.100972 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.100935 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-home\") pod \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\" (UID: \"ab72925a-e8f3-441b-bcd4-b38da39f5a11\") " Apr 24 21:50:08.103273 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.101160 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-model-cache" (OuterVolumeSpecName: "model-cache") pod "ab72925a-e8f3-441b-bcd4-b38da39f5a11" (UID: "ab72925a-e8f3-441b-bcd4-b38da39f5a11"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:08.103273 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.101517 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:50:08.103273 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.101645 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-home" (OuterVolumeSpecName: "home") pod "ab72925a-e8f3-441b-bcd4-b38da39f5a11" (UID: "ab72925a-e8f3-441b-bcd4-b38da39f5a11"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:08.103273 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.103049 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kube-api-access-828dx" (OuterVolumeSpecName: "kube-api-access-828dx") pod "ab72925a-e8f3-441b-bcd4-b38da39f5a11" (UID: "ab72925a-e8f3-441b-bcd4-b38da39f5a11"). InnerVolumeSpecName "kube-api-access-828dx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:50:08.103273 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.103141 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ab72925a-e8f3-441b-bcd4-b38da39f5a11" (UID: "ab72925a-e8f3-441b-bcd4-b38da39f5a11"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:50:08.103669 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.103643 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-dshm" (OuterVolumeSpecName: "dshm") pod "ab72925a-e8f3-441b-bcd4-b38da39f5a11" (UID: "ab72925a-e8f3-441b-bcd4-b38da39f5a11"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:08.119287 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.119248 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "ab72925a-e8f3-441b-bcd4-b38da39f5a11" (UID: "ab72925a-e8f3-441b-bcd4-b38da39f5a11"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:08.168011 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.167967 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ab72925a-e8f3-441b-bcd4-b38da39f5a11" (UID: "ab72925a-e8f3-441b-bcd4-b38da39f5a11"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:50:08.202946 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.202907 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:50:08.202946 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.202940 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:50:08.202946 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.202949 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:50:08.202946 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.202957 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:50:08.203260 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.202968 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:50:08.203260 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.202978 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-828dx\" (UniqueName: \"kubernetes.io/projected/ab72925a-e8f3-441b-bcd4-b38da39f5a11-kube-api-access-828dx\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:50:08.296634 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.296557 2580 log.go:25] "Finished parsing 
log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-7c6c7c46cb-clzrk_ab72925a-e8f3-441b-bcd4-b38da39f5a11/main/0.log" Apr 24 21:50:08.296931 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.296907 2580 generic.go:358] "Generic (PLEG): container finished" podID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerID="6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d" exitCode=137 Apr 24 21:50:08.297043 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.297009 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" event={"ID":"ab72925a-e8f3-441b-bcd4-b38da39f5a11","Type":"ContainerDied","Data":"6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d"} Apr 24 21:50:08.297102 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.297035 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" Apr 24 21:50:08.297102 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.297056 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk" event={"ID":"ab72925a-e8f3-441b-bcd4-b38da39f5a11","Type":"ContainerDied","Data":"f5cd7d610746099902d3f55eed0725651c3f72e43d16ada7a8e6209abc72ebc6"} Apr 24 21:50:08.297102 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.297079 2580 scope.go:117] "RemoveContainer" containerID="6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d" Apr 24 21:50:08.305714 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.305696 2580 scope.go:117] "RemoveContainer" containerID="568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c" Apr 24 21:50:08.314790 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.314775 2580 scope.go:117] "RemoveContainer" containerID="6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d" Apr 24 21:50:08.315051 
ip-10-0-133-36 kubenswrapper[2580]: E0424 21:50:08.315032 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d\": container with ID starting with 6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d not found: ID does not exist" containerID="6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d" Apr 24 21:50:08.315126 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.315063 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d"} err="failed to get container status \"6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d\": rpc error: code = NotFound desc = could not find container \"6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d\": container with ID starting with 6b44f486b2c30d77a4857a8cb07bba91d49ca10fff06ba37a6041c414296ed0d not found: ID does not exist" Apr 24 21:50:08.315126 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.315087 2580 scope.go:117] "RemoveContainer" containerID="568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c" Apr 24 21:50:08.315355 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:50:08.315329 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c\": container with ID starting with 568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c not found: ID does not exist" containerID="568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c" Apr 24 21:50:08.315395 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.315360 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c"} 
err="failed to get container status \"568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c\": rpc error: code = NotFound desc = could not find container \"568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c\": container with ID starting with 568221ad47b4234146cf6cb8e1ac3f440e9c3513fab0eca486efa73b2be6a98c not found: ID does not exist"
Apr 24 21:50:08.321265 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.321236 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"]
Apr 24 21:50:08.327699 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.327676 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-7c6c7c46cb-clzrk"]
Apr 24 21:50:08.689281 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.689249 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" path="/var/lib/kubelet/pods/ab72925a-e8f3-441b-bcd4-b38da39f5a11/volumes"
Apr 24 21:50:08.689554 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:08.689539 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp"]
Apr 24 21:50:09.301859 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:09.301794 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main" containerID="cri-o://9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3" gracePeriod=30
Apr 24 21:50:11.129187 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:11.129155 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"
Apr 24 21:50:11.129187 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:11.129194 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"
Apr 24 21:50:11.130781 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:11.130751 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:50:21.129721 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:21.129680 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:50:25.343867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.343826 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"]
Apr 24 21:50:25.344262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.344176 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main"
Apr 24 21:50:25.344262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.344188 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main"
Apr 24 21:50:25.344262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.344209 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="storage-initializer"
Apr 24 21:50:25.344262 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.344215 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="storage-initializer"
Apr 24 21:50:25.344390 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.344273 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="ab72925a-e8f3-441b-bcd4-b38da39f5a11" containerName="main"
Apr 24 21:50:25.346238 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.346222 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.361919 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.361890 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"]
Apr 24 21:50:25.455621 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.455591 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-dshm\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.455621 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.455630 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-home\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.455828 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.455650 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kserve-provision-location\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.455828 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.455670 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-model-cache\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.455828 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.455718 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tmp-dir\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.455828 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.455782 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9jsxw\" (UniqueName: \"kubernetes.io/projected/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kube-api-access-9jsxw\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.455828 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.455810 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tls-certs\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.556817 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.556782 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-dshm\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.556817 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.556821 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-home\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557086 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.556840 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kserve-provision-location\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557086 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.556859 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-model-cache\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557086 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.556916 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tmp-dir\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557086 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.556965 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9jsxw\" (UniqueName: \"kubernetes.io/projected/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kube-api-access-9jsxw\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557086 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.557005 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tls-certs\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557336 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.557313 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-home\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557394 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.557337 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kserve-provision-location\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557466 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.557437 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-model-cache\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.557680 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.557660 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tmp-dir\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.559161 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.559138 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-dshm\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.559546 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.559528 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tls-certs\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.566209 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.566176 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9jsxw\" (UniqueName: \"kubernetes.io/projected/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kube-api-access-9jsxw\") pod \"stop-feature-test-kserve-8446b959c6-jb8f8\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.656634 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.656596 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:25.795515 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:25.795491 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"]
Apr 24 21:50:25.798539 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:50:25.798508 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb8fc3ad_aa42_40d3_bad8_1439d59562f3.slice/crio-85ca60edb706eb1e6862fecb5103daeb6b7e3873d807f31b28abcd313dcedcc1 WatchSource:0}: Error finding container 85ca60edb706eb1e6862fecb5103daeb6b7e3873d807f31b28abcd313dcedcc1: Status 404 returned error can't find the container with id 85ca60edb706eb1e6862fecb5103daeb6b7e3873d807f31b28abcd313dcedcc1
Apr 24 21:50:26.361763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:26.361724 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" event={"ID":"fb8fc3ad-aa42-40d3-bad8-1439d59562f3","Type":"ContainerStarted","Data":"9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e"}
Apr 24 21:50:26.361763 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:26.361763 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" event={"ID":"fb8fc3ad-aa42-40d3-bad8-1439d59562f3","Type":"ContainerStarted","Data":"85ca60edb706eb1e6862fecb5103daeb6b7e3873d807f31b28abcd313dcedcc1"}
Apr 24 21:50:31.129427 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:31.129377 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:50:31.380961 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:31.380880 2580 generic.go:358] "Generic (PLEG): container finished" podID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerID="9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e" exitCode=0
Apr 24 21:50:31.380961 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:31.380933 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" event={"ID":"fb8fc3ad-aa42-40d3-bad8-1439d59562f3","Type":"ContainerDied","Data":"9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e"}
Apr 24 21:50:32.385907 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:32.385870 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" event={"ID":"fb8fc3ad-aa42-40d3-bad8-1439d59562f3","Type":"ContainerStarted","Data":"f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949"}
Apr 24 21:50:35.657250 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:35.657209 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:35.657250 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:35.657258 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:50:35.658954 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:35.658914 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:50:39.599060 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.599036 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-8446b959c6-fxpwp_7595a027-c4cd-4e80-adfc-5b9512847709/main/0.log"
Apr 24 21:50:39.599465 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.599446 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp"
Apr 24 21:50:39.624718 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.624650 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podStartSLOduration=14.62463075 podStartE2EDuration="14.62463075s" podCreationTimestamp="2026-04-24 21:50:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:50:32.417557518 +0000 UTC m=+1384.215620500" watchObservedRunningTime="2026-04-24 21:50:39.62463075 +0000 UTC m=+1391.422693720"
Apr 24 21:50:39.679454 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.679422 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rdfmq\" (UniqueName: \"kubernetes.io/projected/7595a027-c4cd-4e80-adfc-5b9512847709-kube-api-access-rdfmq\") pod \"7595a027-c4cd-4e80-adfc-5b9512847709\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") "
Apr 24 21:50:39.679640 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.679471 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-home\") pod \"7595a027-c4cd-4e80-adfc-5b9512847709\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") "
Apr 24 21:50:39.679640 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.679490 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-kserve-provision-location\") pod \"7595a027-c4cd-4e80-adfc-5b9512847709\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") "
Apr 24 21:50:39.679640 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.679507 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-dshm\") pod \"7595a027-c4cd-4e80-adfc-5b9512847709\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") "
Apr 24 21:50:39.679640 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.679540 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-model-cache\") pod \"7595a027-c4cd-4e80-adfc-5b9512847709\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") "
Apr 24 21:50:39.679640 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.679574 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-tmp-dir\") pod \"7595a027-c4cd-4e80-adfc-5b9512847709\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") "
Apr 24 21:50:39.679640 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.679629 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7595a027-c4cd-4e80-adfc-5b9512847709-tls-certs\") pod \"7595a027-c4cd-4e80-adfc-5b9512847709\" (UID: \"7595a027-c4cd-4e80-adfc-5b9512847709\") "
Apr 24 21:50:39.680092 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.680032 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-model-cache" (OuterVolumeSpecName: "model-cache") pod "7595a027-c4cd-4e80-adfc-5b9512847709" (UID: "7595a027-c4cd-4e80-adfc-5b9512847709"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:50:39.680356 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.680330 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-home" (OuterVolumeSpecName: "home") pod "7595a027-c4cd-4e80-adfc-5b9512847709" (UID: "7595a027-c4cd-4e80-adfc-5b9512847709"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:50:39.682424 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.682389 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7595a027-c4cd-4e80-adfc-5b9512847709-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7595a027-c4cd-4e80-adfc-5b9512847709" (UID: "7595a027-c4cd-4e80-adfc-5b9512847709"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:50:39.682424 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.682402 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7595a027-c4cd-4e80-adfc-5b9512847709-kube-api-access-rdfmq" (OuterVolumeSpecName: "kube-api-access-rdfmq") pod "7595a027-c4cd-4e80-adfc-5b9512847709" (UID: "7595a027-c4cd-4e80-adfc-5b9512847709"). InnerVolumeSpecName "kube-api-access-rdfmq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:50:39.682424 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.682410 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-dshm" (OuterVolumeSpecName: "dshm") pod "7595a027-c4cd-4e80-adfc-5b9512847709" (UID: "7595a027-c4cd-4e80-adfc-5b9512847709"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:50:39.697575 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.697544 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "7595a027-c4cd-4e80-adfc-5b9512847709" (UID: "7595a027-c4cd-4e80-adfc-5b9512847709"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:50:39.753380 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.753337 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7595a027-c4cd-4e80-adfc-5b9512847709" (UID: "7595a027-c4cd-4e80-adfc-5b9512847709"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:50:39.780710 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.780680 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7595a027-c4cd-4e80-adfc-5b9512847709-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:50:39.780710 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.780708 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rdfmq\" (UniqueName: \"kubernetes.io/projected/7595a027-c4cd-4e80-adfc-5b9512847709-kube-api-access-rdfmq\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:50:39.780885 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.780718 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:50:39.780885 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.780728 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:50:39.780885 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.780736 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:50:39.780885 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.780745 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:50:39.780885 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:39.780753 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/7595a027-c4cd-4e80-adfc-5b9512847709-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:50:40.417284 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.417253 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-8446b959c6-fxpwp_7595a027-c4cd-4e80-adfc-5b9512847709/main/0.log"
Apr 24 21:50:40.417632 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.417606 2580 generic.go:358] "Generic (PLEG): container finished" podID="7595a027-c4cd-4e80-adfc-5b9512847709" containerID="9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3" exitCode=137
Apr 24 21:50:40.417784 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.417640 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" event={"ID":"7595a027-c4cd-4e80-adfc-5b9512847709","Type":"ContainerDied","Data":"9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3"}
Apr 24 21:50:40.417784 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.417661 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp" event={"ID":"7595a027-c4cd-4e80-adfc-5b9512847709","Type":"ContainerDied","Data":"6b422844f551823e2240d40b0fbf0be22b9dd4a6331b6e648a443b986cb92f1b"}
Apr 24 21:50:40.417784 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.417679 2580 scope.go:117] "RemoveContainer" containerID="9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3"
Apr 24 21:50:40.417784 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.417682 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp"
Apr 24 21:50:40.427714 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.427692 2580 scope.go:117] "RemoveContainer" containerID="fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4"
Apr 24 21:50:40.439987 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.439950 2580 scope.go:117] "RemoveContainer" containerID="9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3"
Apr 24 21:50:40.440364 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:50:40.440341 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3\": container with ID starting with 9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3 not found: ID does not exist" containerID="9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3"
Apr 24 21:50:40.440466 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.440372 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3"} err="failed to get container status \"9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3\": rpc error: code = NotFound desc = could not find container \"9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3\": container with ID starting with 9d82a25464981db07e782546d82044bd8fbee7ce111ccf30b7da3deadac60ce3 not found: ID does not exist"
Apr 24 21:50:40.440466 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.440389 2580 scope.go:117] "RemoveContainer" containerID="fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4"
Apr 24 21:50:40.440742 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:50:40.440691 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4\": container with ID starting with fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4 not found: ID does not exist" containerID="fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4"
Apr 24 21:50:40.440742 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.440727 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4"} err="failed to get container status \"fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4\": rpc error: code = NotFound desc = could not find container \"fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4\": container with ID starting with fe841de03f027947420cdee7e44f3b31cd873910ff13db94fda0c5b747c2ffc4 not found: ID does not exist"
Apr 24 21:50:40.442587 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.442565 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp"]
Apr 24 21:50:40.444557 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.444531 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-fxpwp"]
Apr 24 21:50:40.688420 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:40.688331 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" path="/var/lib/kubelet/pods/7595a027-c4cd-4e80-adfc-5b9512847709/volumes"
Apr 24 21:50:41.130367 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:41.130319 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:50:45.658071 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:45.658030 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:50:51.129812 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:51.129766 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:50:55.657627 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:50:55.657568 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:51:01.130079 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:01.130036 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:51:05.657527 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:05.657481 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:51:11.129843 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:11.129790 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:51:15.657406 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:15.657303 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:51:21.130315 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:21.130265 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:51:25.658034 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:25.657970 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:51:31.130249 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:31.130198 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:51:35.658116 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:35.658062 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:51:41.129641 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:41.129594 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:51:45.657701 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:45.657661 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:51:51.130447 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:51.130387 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" probeResult="failure" output="Get \"https://10.132.0.35:8000/health\": dial tcp 10.132.0.35:8000: connect: connection refused"
Apr 24 21:51:55.658091 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:51:55.658033 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:52:01.139129 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:01.139086 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"
Apr 24 21:52:01.147072 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:01.147040 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"
Apr 24 21:52:05.657806 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:05.657761 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:52:10.264108 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:10.264075 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"]
Apr 24 21:52:10.264564 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:10.264436 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main" containerID="cri-o://46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7" gracePeriod=30
Apr 24 21:52:15.657858 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:15.657812 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" probeResult="failure" output="Get \"https://10.132.0.36:8000/health\": dial tcp 10.132.0.36:8000: connect: connection refused"
Apr 24 21:52:17.563788 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.563752 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"]
Apr 24 21:52:17.564188 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.564085 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="storage-initializer"
Apr 24 21:52:17.564188 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.564097 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="storage-initializer"
Apr 24 21:52:17.564188 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.564110 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main"
Apr 24 21:52:17.564188 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.564115 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="7595a027-c4cd-4e80-adfc-5b9512847709" containerName="main"
Apr 24 21:52:17.564188 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.564161 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="7595a027-c4cd-4e80-adfc-5b9512847709"
containerName="main" Apr 24 21:52:17.569214 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.569188 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.572580 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.572553 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 24 21:52:17.573196 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.572967 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-9mkgd\"" Apr 24 21:52:17.576499 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.576468 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"] Apr 24 21:52:17.581363 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.581338 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.583600 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.583572 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"] Apr 24 21:52:17.598642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.598613 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"] Apr 24 21:52:17.633180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633146 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.633180 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633183 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.633414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633206 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t2jz\" (UniqueName: \"kubernetes.io/projected/de640863-17a1-45ba-bba0-ed8998e7bc74-kube-api-access-8t2jz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.633414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633225 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9mt9t\" (UniqueName: \"kubernetes.io/projected/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kube-api-access-9mt9t\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.633414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633286 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.633414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633337 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.633414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633378 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.633414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633403 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de640863-17a1-45ba-bba0-ed8998e7bc74-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.633686 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633422 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.633686 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633452 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.633686 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633487 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: 
\"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.633686 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633504 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.633686 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633576 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.633686 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.633624 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.733974 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.733942 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.734186 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.733981 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.734186 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734026 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t2jz\" (UniqueName: \"kubernetes.io/projected/de640863-17a1-45ba-bba0-ed8998e7bc74-kube-api-access-8t2jz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.734321 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734292 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9mt9t\" (UniqueName: \"kubernetes.io/projected/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kube-api-access-9mt9t\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.734375 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734357 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.734435 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734417 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.734492 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734470 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.734492 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734472 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.734598 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734495 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.734598 ip-10-0-133-36 
kubenswrapper[2580]: I0424 21:52:17.734498 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de640863-17a1-45ba-bba0-ed8998e7bc74-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.734598 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734551 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.734598 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734581 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.734797 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734635 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.734797 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:52:17.734659 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.734797 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734709 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.734797 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734752 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.735017 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734811 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.735017 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.734974 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.735131 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.735044 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.735131 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.735086 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.735350 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.735317 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.735574 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.735547 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.736904 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.736885 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.737137 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.737117 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de640863-17a1-45ba-bba0-ed8998e7bc74-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.737250 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.737233 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.737499 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.737476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" 
(UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.742877 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.742851 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t2jz\" (UniqueName: \"kubernetes.io/projected/de640863-17a1-45ba-bba0-ed8998e7bc74-kube-api-access-8t2jz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.742970 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.742951 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9mt9t\" (UniqueName: \"kubernetes.io/projected/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kube-api-access-9mt9t\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:17.881068 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.881027 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:17.893958 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:17.893905 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:52:18.035399 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:18.035354 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"] Apr 24 21:52:18.038758 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:52:18.038729 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde640863_17a1_45ba_bba0_ed8998e7bc74.slice/crio-aad9b218cc3fad1458e7303967157868e91950716965e13c0a0a259704c1810c WatchSource:0}: Error finding container aad9b218cc3fad1458e7303967157868e91950716965e13c0a0a259704c1810c: Status 404 returned error can't find the container with id aad9b218cc3fad1458e7303967157868e91950716965e13c0a0a259704c1810c Apr 24 21:52:18.040812 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:18.040788 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:52:18.049814 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:18.049689 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"] Apr 24 21:52:18.052570 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:52:18.052547 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3c0e89ca_8bc8_4ba7_9628_64cad681f832.slice/crio-7ff98efbcee595240de56f69d9251dac64e3c6ed5fa89a3761c139161aa23a34 WatchSource:0}: Error finding container 7ff98efbcee595240de56f69d9251dac64e3c6ed5fa89a3761c139161aa23a34: Status 404 returned error can't find the container with id 7ff98efbcee595240de56f69d9251dac64e3c6ed5fa89a3761c139161aa23a34 Apr 24 21:52:18.749487 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:18.749403 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" event={"ID":"3c0e89ca-8bc8-4ba7-9628-64cad681f832","Type":"ContainerStarted","Data":"af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce"} Apr 24 21:52:18.749487 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:18.749456 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" event={"ID":"3c0e89ca-8bc8-4ba7-9628-64cad681f832","Type":"ContainerStarted","Data":"7ff98efbcee595240de56f69d9251dac64e3c6ed5fa89a3761c139161aa23a34"} Apr 24 21:52:18.750725 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:18.750697 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" event={"ID":"de640863-17a1-45ba-bba0-ed8998e7bc74","Type":"ContainerStarted","Data":"aad9b218cc3fad1458e7303967157868e91950716965e13c0a0a259704c1810c"} Apr 24 21:52:19.755747 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:19.755697 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" event={"ID":"de640863-17a1-45ba-bba0-ed8998e7bc74","Type":"ContainerStarted","Data":"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2"} Apr 24 21:52:19.756246 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:19.755859 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:52:20.761171 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:20.761127 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" event={"ID":"de640863-17a1-45ba-bba0-ed8998e7bc74","Type":"ContainerStarted","Data":"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd"} 
Apr 24 21:52:25.667719 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:25.667686 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:52:25.675785 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:25.675762 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:52:26.815797 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:26.815742 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"]
Apr 24 21:52:26.816454 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:26.816012 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main" containerID="cri-o://f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949" gracePeriod=30
Apr 24 21:52:28.707319 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:28.707291 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 21:52:28.710463 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:28.710438 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 21:52:28.711151 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:28.711130 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 21:52:28.714458 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:28.714439 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 21:52:31.777532 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:31.777495 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"
Apr 24 21:52:40.516068 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.516042 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5d74d49756-cw8kg_742a75c5-0802-4047-8b47-9600e7751e1e/main/0.log"
Apr 24 21:52:40.516441 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.516421 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"
Apr 24 21:52:40.631836 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.631805 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-dshm\") pod \"742a75c5-0802-4047-8b47-9600e7751e1e\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") "
Apr 24 21:52:40.631836 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.631851 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qq97b\" (UniqueName: \"kubernetes.io/projected/742a75c5-0802-4047-8b47-9600e7751e1e-kube-api-access-qq97b\") pod \"742a75c5-0802-4047-8b47-9600e7751e1e\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") "
Apr 24 21:52:40.632132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.631873 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-kserve-provision-location\") pod \"742a75c5-0802-4047-8b47-9600e7751e1e\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") "
Apr 24 21:52:40.632132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.631901 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/742a75c5-0802-4047-8b47-9600e7751e1e-tls-certs\") pod \"742a75c5-0802-4047-8b47-9600e7751e1e\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") "
Apr 24 21:52:40.632132 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.632085 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-home\") pod \"742a75c5-0802-4047-8b47-9600e7751e1e\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") "
Apr 24 21:52:40.632304 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.632134 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-tmp-dir\") pod \"742a75c5-0802-4047-8b47-9600e7751e1e\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") "
Apr 24 21:52:40.632304 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.632186 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-model-cache\") pod \"742a75c5-0802-4047-8b47-9600e7751e1e\" (UID: \"742a75c5-0802-4047-8b47-9600e7751e1e\") "
Apr 24 21:52:40.632665 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.632639 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-model-cache" (OuterVolumeSpecName: "model-cache") pod "742a75c5-0802-4047-8b47-9600e7751e1e" (UID: "742a75c5-0802-4047-8b47-9600e7751e1e"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:40.633095 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.633068 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-home" (OuterVolumeSpecName: "home") pod "742a75c5-0802-4047-8b47-9600e7751e1e" (UID: "742a75c5-0802-4047-8b47-9600e7751e1e"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:40.634165 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.634140 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-dshm" (OuterVolumeSpecName: "dshm") pod "742a75c5-0802-4047-8b47-9600e7751e1e" (UID: "742a75c5-0802-4047-8b47-9600e7751e1e"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:40.634264 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.634203 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/742a75c5-0802-4047-8b47-9600e7751e1e-kube-api-access-qq97b" (OuterVolumeSpecName: "kube-api-access-qq97b") pod "742a75c5-0802-4047-8b47-9600e7751e1e" (UID: "742a75c5-0802-4047-8b47-9600e7751e1e"). InnerVolumeSpecName "kube-api-access-qq97b". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:52:40.634504 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.634489 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/742a75c5-0802-4047-8b47-9600e7751e1e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "742a75c5-0802-4047-8b47-9600e7751e1e" (UID: "742a75c5-0802-4047-8b47-9600e7751e1e"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:52:40.644625 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.644583 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "742a75c5-0802-4047-8b47-9600e7751e1e" (UID: "742a75c5-0802-4047-8b47-9600e7751e1e"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:40.695377 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.695321 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "742a75c5-0802-4047-8b47-9600e7751e1e" (UID: "742a75c5-0802-4047-8b47-9600e7751e1e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:40.733187 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.733148 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:40.733187 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.733182 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qq97b\" (UniqueName: \"kubernetes.io/projected/742a75c5-0802-4047-8b47-9600e7751e1e-kube-api-access-qq97b\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:40.733187 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.733194 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:40.733410 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.733205 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/742a75c5-0802-4047-8b47-9600e7751e1e-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:40.733410 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.733214 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:40.733410 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.733221 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:40.733410 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.733229 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/742a75c5-0802-4047-8b47-9600e7751e1e-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:40.835259 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.835225 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-5d74d49756-cw8kg_742a75c5-0802-4047-8b47-9600e7751e1e/main/0.log"
Apr 24 21:52:40.835629 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.835601 2580 generic.go:358] "Generic (PLEG): container finished" podID="742a75c5-0802-4047-8b47-9600e7751e1e" containerID="46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7" exitCode=137
Apr 24 21:52:40.835688 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.835674 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"
Apr 24 21:52:40.835724 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.835694 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" event={"ID":"742a75c5-0802-4047-8b47-9600e7751e1e","Type":"ContainerDied","Data":"46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7"}
Apr 24 21:52:40.835803 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.835737 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg" event={"ID":"742a75c5-0802-4047-8b47-9600e7751e1e","Type":"ContainerDied","Data":"8c087b38dc380babe0eb98e8c02dc59128ccb54769a0009e2a1019dd55636f29"}
Apr 24 21:52:40.835803 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.835755 2580 scope.go:117] "RemoveContainer" containerID="46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7"
Apr 24 21:52:40.844922 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.844900 2580 scope.go:117] "RemoveContainer" containerID="76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017"
Apr 24 21:52:40.858067 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.858039 2580 scope.go:117] "RemoveContainer" containerID="46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7"
Apr 24 21:52:40.858504 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:52:40.858479 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7\": container with ID starting with 46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7 not found: ID does not exist" containerID="46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7"
Apr 24 21:52:40.858613 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.858517 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7"} err="failed to get container status \"46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7\": rpc error: code = NotFound desc = could not find container \"46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7\": container with ID starting with 46007f0a272cd976e6f3e835b2b1c809cc4b05eda86fe831f769f7453911a6b7 not found: ID does not exist"
Apr 24 21:52:40.858613 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.858543 2580 scope.go:117] "RemoveContainer" containerID="76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017"
Apr 24 21:52:40.858912 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:52:40.858872 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017\": container with ID starting with 76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017 not found: ID does not exist" containerID="76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017"
Apr 24 21:52:40.859019 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.858918 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017"} err="failed to get container status \"76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017\": rpc error: code = NotFound desc = could not find container \"76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017\": container with ID starting with 76014d94a41435d4027d1af4c06f0de7bb37cfde4d683f5fe8d8e96b8ac86017 not found: ID does not exist"
Apr 24 21:52:40.860492 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.860457 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"]
Apr 24 21:52:40.863695 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:40.863671 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-5d74d49756-cw8kg"]
Apr 24 21:52:42.692791 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:42.692754 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" path="/var/lib/kubelet/pods/742a75c5-0802-4047-8b47-9600e7751e1e/volumes"
Apr 24 21:52:57.069045 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.069021 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-8446b959c6-jb8f8_fb8fc3ad-aa42-40d3-bad8-1439d59562f3/main/0.log"
Apr 24 21:52:57.069441 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.069354 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:52:57.178143 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178110 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tmp-dir\") pod \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") "
Apr 24 21:52:57.178294 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178150 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-dshm\") pod \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") "
Apr 24 21:52:57.178294 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178187 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-model-cache\") pod \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") "
Apr 24 21:52:57.178294 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178222 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tls-certs\") pod \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") "
Apr 24 21:52:57.178294 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178256 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kserve-provision-location\") pod \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") "
Apr 24 21:52:57.178493 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178300 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9jsxw\" (UniqueName: \"kubernetes.io/projected/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kube-api-access-9jsxw\") pod \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") "
Apr 24 21:52:57.178493 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178358 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-home\") pod \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\" (UID: \"fb8fc3ad-aa42-40d3-bad8-1439d59562f3\") "
Apr 24 21:52:57.178569 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178494 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-model-cache" (OuterVolumeSpecName: "model-cache") pod "fb8fc3ad-aa42-40d3-bad8-1439d59562f3" (UID: "fb8fc3ad-aa42-40d3-bad8-1439d59562f3"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:57.178641 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.178624 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:57.179418 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.179391 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-home" (OuterVolumeSpecName: "home") pod "fb8fc3ad-aa42-40d3-bad8-1439d59562f3" (UID: "fb8fc3ad-aa42-40d3-bad8-1439d59562f3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:57.180860 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.180835 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-dshm" (OuterVolumeSpecName: "dshm") pod "fb8fc3ad-aa42-40d3-bad8-1439d59562f3" (UID: "fb8fc3ad-aa42-40d3-bad8-1439d59562f3"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:57.180961 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.180848 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fb8fc3ad-aa42-40d3-bad8-1439d59562f3" (UID: "fb8fc3ad-aa42-40d3-bad8-1439d59562f3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:52:57.181037 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.181018 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kube-api-access-9jsxw" (OuterVolumeSpecName: "kube-api-access-9jsxw") pod "fb8fc3ad-aa42-40d3-bad8-1439d59562f3" (UID: "fb8fc3ad-aa42-40d3-bad8-1439d59562f3"). InnerVolumeSpecName "kube-api-access-9jsxw". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:52:57.190593 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.190569 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "fb8fc3ad-aa42-40d3-bad8-1439d59562f3" (UID: "fb8fc3ad-aa42-40d3-bad8-1439d59562f3"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:57.239076 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.239049 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb8fc3ad-aa42-40d3-bad8-1439d59562f3" (UID: "fb8fc3ad-aa42-40d3-bad8-1439d59562f3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:52:57.279462 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.279439 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9jsxw\" (UniqueName: \"kubernetes.io/projected/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kube-api-access-9jsxw\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:57.279462 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.279462 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:57.279565 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.279472 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:57.279565 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.279480 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:57.279565 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.279488 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:57.279565 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.279497 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb8fc3ad-aa42-40d3-bad8-1439d59562f3-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 21:52:57.892540 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.892513 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-8446b959c6-jb8f8_fb8fc3ad-aa42-40d3-bad8-1439d59562f3/main/0.log"
Apr 24 21:52:57.892853 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.892830 2580 generic.go:358] "Generic (PLEG): container finished" podID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerID="f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949" exitCode=137
Apr 24 21:52:57.892928 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.892879 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" event={"ID":"fb8fc3ad-aa42-40d3-bad8-1439d59562f3","Type":"ContainerDied","Data":"f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949"}
Apr 24 21:52:57.892928 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.892907 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8" event={"ID":"fb8fc3ad-aa42-40d3-bad8-1439d59562f3","Type":"ContainerDied","Data":"85ca60edb706eb1e6862fecb5103daeb6b7e3873d807f31b28abcd313dcedcc1"}
Apr 24 21:52:57.893029 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.892927 2580 scope.go:117] "RemoveContainer" containerID="f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949"
Apr 24 21:52:57.893029 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.892944 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"
Apr 24 21:52:57.901184 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.901166 2580 scope.go:117] "RemoveContainer" containerID="9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e"
Apr 24 21:52:57.915425 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.915406 2580 scope.go:117] "RemoveContainer" containerID="f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949"
Apr 24 21:52:57.915765 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:52:57.915730 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949\": container with ID starting with f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949 not found: ID does not exist" containerID="f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949"
Apr 24 21:52:57.915860 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.915783 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949"} err="failed to get container status \"f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949\": rpc error: code = NotFound desc = could not find container \"f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949\": container with ID starting with f2e6400541e4d6324b78a7b62b8fc8c1d3d50bb05360cf624112a7cc3ffea949 not found: ID does not exist"
Apr 24 21:52:57.915860 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.915811 2580 scope.go:117] "RemoveContainer" containerID="9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e"
Apr 24 21:52:57.916209 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:52:57.916173 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e\": container with ID starting with 9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e not found: ID does not exist" containerID="9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e"
Apr 24 21:52:57.916316 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.916210 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e"} err="failed to get container status \"9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e\": rpc error: code = NotFound desc = could not find container \"9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e\": container with ID starting with 9350e9a687b53ab2a8cd97dd55979823e2783ee7137a70072b0dcd96d194bf5e not found: ID does not exist"
Apr 24 21:52:57.917237 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.917209 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"]
Apr 24 21:52:57.920986 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:57.920967 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-8446b959c6-jb8f8"]
Apr 24 21:52:58.688610 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:52:58.688578 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" path="/var/lib/kubelet/pods/fb8fc3ad-aa42-40d3-bad8-1439d59562f3/volumes"
Apr 24 21:53:43.044914 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:43.044880 2580 generic.go:358] "Generic (PLEG): container finished" podID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerID="af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce" exitCode=0
Apr 24 21:53:43.045322 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:43.044935 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" event={"ID":"3c0e89ca-8bc8-4ba7-9628-64cad681f832","Type":"ContainerDied","Data":"af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce"}
Apr 24 21:53:44.050024 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:44.049968 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" event={"ID":"3c0e89ca-8bc8-4ba7-9628-64cad681f832","Type":"ContainerStarted","Data":"948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4"}
Apr 24 21:53:44.075585 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:44.075526 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podStartSLOduration=87.075511834 podStartE2EDuration="1m27.075511834s" podCreationTimestamp="2026-04-24 21:52:17 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:53:44.073828901 +0000 UTC m=+1575.871891892" watchObservedRunningTime="2026-04-24 21:53:44.075511834 +0000 UTC m=+1575.873574803"
Apr 24 21:53:47.894296 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:47.894260 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"
Apr 24 21:53:47.894676 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:47.894311 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"
Apr 24 21:53:47.895808 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:47.895773 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused"
Apr 24 21:53:51.077414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.077376 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"]
Apr 24 21:53:51.078351 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078323 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="storage-initializer"
Apr 24 21:53:51.078351 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078352 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="storage-initializer"
Apr 24 21:53:51.078512 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078366 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main"
Apr 24 21:53:51.078512 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078375 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main"
Apr 24 21:53:51.078512 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078422 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main"
Apr 24 21:53:51.078512 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078431 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main"
Apr 24 21:53:51.078512 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078458 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="storage-initializer"
Apr 24 21:53:51.078512 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078466 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="storage-initializer"
Apr 24 21:53:51.078719 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078556 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb8fc3ad-aa42-40d3-bad8-1439d59562f3" containerName="main"
Apr 24 21:53:51.078719 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.078569 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="742a75c5-0802-4047-8b47-9600e7751e1e" containerName="main"
Apr 24 21:53:51.113945 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.113908 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"]
Apr 24 21:53:51.114088 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.114073 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.116116 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.116097 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\""
Apr 24 21:53:51.152468 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.152442 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.152599 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.152480 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.152599 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.152506 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.152599 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.152548 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qf9ck\" (UniqueName: \"kubernetes.io/projected/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kube-api-access-qf9ck\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.152599 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.152589 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.152759 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.152622 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.152759 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.152658 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.253629 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.253603 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.253794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.253646 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"
Apr 24 21:53:51.253794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.253675 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\"
(UniqueName: \"kubernetes.io/secret/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.253794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.253704 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.253794 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.253726 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qf9ck\" (UniqueName: \"kubernetes.io/projected/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kube-api-access-qf9ck\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.254133 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.253866 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.254133 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.253925 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.254236 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.254207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.254288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.254250 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.254288 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.254262 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.254462 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.254441 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tmp-dir\") pod 
\"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.255952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.255932 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.256325 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.256304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.263080 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.263058 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qf9ck\" (UniqueName: \"kubernetes.io/projected/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kube-api-access-qf9ck\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.424369 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.424332 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:53:51.560366 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:51.560333 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"] Apr 24 21:53:51.563446 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:53:51.563415 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod38b619a1_d7c9_47a6_8014_5cf88416ffa9.slice/crio-f3cd2d6f9cb5c8a343418ac9a9f4e93f643abf1d1d7ae49e79455cf59e1962bb WatchSource:0}: Error finding container f3cd2d6f9cb5c8a343418ac9a9f4e93f643abf1d1d7ae49e79455cf59e1962bb: Status 404 returned error can't find the container with id f3cd2d6f9cb5c8a343418ac9a9f4e93f643abf1d1d7ae49e79455cf59e1962bb Apr 24 21:53:52.090098 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:52.089988 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" event={"ID":"38b619a1-d7c9-47a6-8014-5cf88416ffa9","Type":"ContainerStarted","Data":"782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f"} Apr 24 21:53:52.090098 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:52.090056 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" event={"ID":"38b619a1-d7c9-47a6-8014-5cf88416ffa9","Type":"ContainerStarted","Data":"f3cd2d6f9cb5c8a343418ac9a9f4e93f643abf1d1d7ae49e79455cf59e1962bb"} Apr 24 21:53:57.895221 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:53:57.895183 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get 
\"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:54:02.126359 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:02.126314 2580 generic.go:358] "Generic (PLEG): container finished" podID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerID="cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd" exitCode=0 Apr 24 21:54:02.126829 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:02.126388 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" event={"ID":"de640863-17a1-45ba-bba0-ed8998e7bc74","Type":"ContainerDied","Data":"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd"} Apr 24 21:54:03.132852 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:03.132816 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" event={"ID":"de640863-17a1-45ba-bba0-ed8998e7bc74","Type":"ContainerStarted","Data":"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be"} Apr 24 21:54:03.162760 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:03.162694 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podStartSLOduration=105.365971066 podStartE2EDuration="1m46.162672257s" podCreationTimestamp="2026-04-24 21:52:17 +0000 UTC" firstStartedPulling="2026-04-24 21:52:18.040983434 +0000 UTC m=+1489.839046396" lastFinishedPulling="2026-04-24 21:52:18.837684624 +0000 UTC m=+1490.635747587" observedRunningTime="2026-04-24 21:54:03.161648125 +0000 UTC m=+1594.959711096" watchObservedRunningTime="2026-04-24 21:54:03.162672257 +0000 UTC m=+1594.960735228" Apr 24 21:54:07.881870 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:07.881822 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:54:07.881870 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:07.881874 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:54:07.882547 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:07.882198 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:54:07.894690 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:07.894650 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:54:17.881838 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:17.881725 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:54:17.894777 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:17.894718 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: 
connection refused" Apr 24 21:54:27.882578 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:27.882528 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:54:27.895203 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:27.895159 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:54:37.882160 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:37.882102 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:54:37.895098 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:37.895065 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:54:47.882094 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:47.882039 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" 
probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:54:47.894450 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:47.894404 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:54:57.881657 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:57.881604 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:54:57.894342 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:54:57.894302 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:55:07.882073 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:07.882022 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:55:07.895116 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:07.895071 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:55:17.882268 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:17.882212 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:55:17.894280 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:17.894241 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:55:27.882382 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:27.882327 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:55:27.894778 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:27.894739 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:55:37.882523 ip-10-0-133-36 
kubenswrapper[2580]: I0424 21:55:37.882470 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:55:37.894888 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:37.894850 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" probeResult="failure" output="Get \"https://10.132.0.38:8000/health\": dial tcp 10.132.0.38:8000: connect: connection refused" Apr 24 21:55:42.493781 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:42.493745 2580 generic.go:358] "Generic (PLEG): container finished" podID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerID="782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f" exitCode=0 Apr 24 21:55:42.494164 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:42.493831 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" event={"ID":"38b619a1-d7c9-47a6-8014-5cf88416ffa9","Type":"ContainerDied","Data":"782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f"} Apr 24 21:55:43.498726 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:43.498688 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" event={"ID":"38b619a1-d7c9-47a6-8014-5cf88416ffa9","Type":"ContainerStarted","Data":"fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4"} Apr 24 21:55:43.531009 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:43.530914 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podStartSLOduration=112.530896264 podStartE2EDuration="1m52.530896264s" podCreationTimestamp="2026-04-24 21:53:51 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:55:43.527553876 +0000 UTC m=+1695.325616857" watchObservedRunningTime="2026-04-24 21:55:43.530896264 +0000 UTC m=+1695.328959236" Apr 24 21:55:47.881699 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:47.881586 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:55:47.905167 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:47.905131 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:55:47.916161 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:47.916126 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:55:51.425149 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:51.425102 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:55:51.425149 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:51.425149 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:55:51.426708 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:51.426674 2580 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 24 21:55:57.881746 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:55:57.881701 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" probeResult="failure" output="Get \"https://10.132.0.37:8001/health\": dial tcp 10.132.0.37:8001: connect: connection refused" Apr 24 21:56:01.425373 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:01.425321 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 24 21:56:07.897010 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:07.896961 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:56:07.910068 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:07.910038 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" Apr 24 21:56:11.425338 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:11.425291 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial 
tcp 10.132.0.39:8000: connect: connection refused" Apr 24 21:56:21.124690 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:21.124643 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"] Apr 24 21:56:21.125385 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:21.125326 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" containerID="cri-o://15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be" gracePeriod=30 Apr 24 21:56:21.126709 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:21.126678 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"] Apr 24 21:56:21.127062 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:21.127016 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" containerID="cri-o://948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4" gracePeriod=30 Apr 24 21:56:21.425712 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:21.425617 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 24 21:56:30.630595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.630560 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"] Apr 24 21:56:30.634486 ip-10-0-133-36 
kubenswrapper[2580]: I0424 21:56:30.634462 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.636853 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.636826 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-vdnlh\""
Apr 24 21:56:30.636974 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.636905 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\""
Apr 24 21:56:30.638989 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.638966 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-dshm\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.639102 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.639061 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-home\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.639102 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.639084 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tc8cn\" (UniqueName: \"kubernetes.io/projected/14aed0c5-69c3-4390-902b-ef799b0f9c16-kube-api-access-tc8cn\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.639212 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.639179 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-model-cache\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.639212 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.639197 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14aed0c5-69c3-4390-902b-ef799b0f9c16-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.639328 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.639214 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.639328 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.639245 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.649746 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.649714 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"]
Apr 24 21:56:30.655362 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.655334 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"]
Apr 24 21:56:30.659588 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.659566 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.673590 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.673563 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"]
Apr 24 21:56:30.739970 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.739915 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-dshm\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.740250 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.739983 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.740250 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740043 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j2vbq\" (UniqueName: \"kubernetes.io/projected/677c14b9-087d-4b41-839f-cc6a7d126def-kube-api-access-j2vbq\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.740250 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740092 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-home\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.740250 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740157 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tc8cn\" (UniqueName: \"kubernetes.io/projected/14aed0c5-69c3-4390-902b-ef799b0f9c16-kube-api-access-tc8cn\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.740501 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740249 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.740501 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740287 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.740501 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740326 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.740501 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740352 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/677c14b9-087d-4b41-839f-cc6a7d126def-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.740501 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740408 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-model-cache\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.740501 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740442 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14aed0c5-69c3-4390-902b-ef799b0f9c16-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.740501 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740479 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.740943 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740514 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.740943 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740550 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.740943 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740717 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-home\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.740943 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740886 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.741208 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.740956 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-model-cache\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.741297 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.741271 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.743636 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.743476 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-dshm\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.744232 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.744207 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14aed0c5-69c3-4390-902b-ef799b0f9c16-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.750702 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.750667 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tc8cn\" (UniqueName: \"kubernetes.io/projected/14aed0c5-69c3-4390-902b-ef799b0f9c16-kube-api-access-tc8cn\") pod \"custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.841817 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.841773 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842021 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.841829 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j2vbq\" (UniqueName: \"kubernetes.io/projected/677c14b9-087d-4b41-839f-cc6a7d126def-kube-api-access-j2vbq\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842021 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.841910 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842021 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.841947 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842021 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.841984 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842233 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.842129 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/677c14b9-087d-4b41-839f-cc6a7d126def-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842233 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.842215 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842340 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.842216 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842340 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.842291 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842443 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.842353 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.842536 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.842515 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.844252 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.844223 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.844545 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.844528 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/677c14b9-087d-4b41-839f-cc6a7d126def-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.851259 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.851232 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j2vbq\" (UniqueName: \"kubernetes.io/projected/677c14b9-087d-4b41-839f-cc6a7d126def-kube-api-access-j2vbq\") pod \"custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:30.943984 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.943886 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:30.973899 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:30.973842 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 21:56:31.101975 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:31.101947 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"]
Apr 24 21:56:31.125460 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:31.125369 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"]
Apr 24 21:56:31.128070 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:56:31.128043 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod677c14b9_087d_4b41_839f_cc6a7d126def.slice/crio-f92ddbfa1ea382e8084aad9dc2d4b4118624e67c6f4d81b4eee1e9d1ef082ec5 WatchSource:0}: Error finding container f92ddbfa1ea382e8084aad9dc2d4b4118624e67c6f4d81b4eee1e9d1ef082ec5: Status 404 returned error can't find the container with id f92ddbfa1ea382e8084aad9dc2d4b4118624e67c6f4d81b4eee1e9d1ef082ec5
Apr 24 21:56:31.425684 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:31.425633 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused"
Apr 24 21:56:31.673429 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:31.673395 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" event={"ID":"677c14b9-087d-4b41-839f-cc6a7d126def","Type":"ContainerStarted","Data":"bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9"}
Apr 24 21:56:31.673429 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:31.673433 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" event={"ID":"677c14b9-087d-4b41-839f-cc6a7d126def","Type":"ContainerStarted","Data":"f92ddbfa1ea382e8084aad9dc2d4b4118624e67c6f4d81b4eee1e9d1ef082ec5"}
Apr 24 21:56:31.674665 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:31.674625 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" event={"ID":"14aed0c5-69c3-4390-902b-ef799b0f9c16","Type":"ContainerStarted","Data":"5ff0f6c9ab6f52c33dafb9f515ed9001b67fe36d40044d0e97362599861d640d"}
Apr 24 21:56:31.674665 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:31.674655 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" event={"ID":"14aed0c5-69c3-4390-902b-ef799b0f9c16","Type":"ContainerStarted","Data":"53b7c21e404844a9e17e21539b2b57db985a7af790b0374b3fd5cc8c35a90120"}
Apr 24 21:56:31.674867 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:31.674752 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:32.693284 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:32.693239 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" event={"ID":"14aed0c5-69c3-4390-902b-ef799b0f9c16","Type":"ContainerStarted","Data":"5f21e0ee72f529af0f44a3a684ead0a0f9b1ba546b0e95e57ec361c61e621771"}
Apr 24 21:56:41.425637 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:41.425569 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused"
Apr 24 21:56:43.703417 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:43.703384 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 21:56:51.126131 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.126054 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="llm-d-routing-sidecar" containerID="cri-o://4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2" gracePeriod=2
Apr 24 21:56:51.425584 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.425498 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused"
Apr 24 21:56:51.582891 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.582870 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"
Apr 24 21:56:51.585836 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.585817 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8_de640863-17a1-45ba-bba0-ed8998e7bc74/main/0.log"
Apr 24 21:56:51.586413 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.586396 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"
Apr 24 21:56:51.743645 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743547 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-kserve-provision-location\") pod \"de640863-17a1-45ba-bba0-ed8998e7bc74\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") "
Apr 24 21:56:51.743645 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743597 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tls-certs\") pod \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") "
Apr 24 21:56:51.743645 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743639 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tmp-dir\") pod \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743678 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-9mt9t\" (UniqueName: \"kubernetes.io/projected/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kube-api-access-9mt9t\") pod \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743699 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-dshm\") pod \"de640863-17a1-45ba-bba0-ed8998e7bc74\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743726 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kserve-provision-location\") pod \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743747 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-tmp-dir\") pod \"de640863-17a1-45ba-bba0-ed8998e7bc74\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743770 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-model-cache\") pod \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743793 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8t2jz\" (UniqueName: \"kubernetes.io/projected/de640863-17a1-45ba-bba0-ed8998e7bc74-kube-api-access-8t2jz\") pod \"de640863-17a1-45ba-bba0-ed8998e7bc74\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743814 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-model-cache\") pod \"de640863-17a1-45ba-bba0-ed8998e7bc74\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743886 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-home\") pod \"de640863-17a1-45ba-bba0-ed8998e7bc74\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") "
Apr 24 21:56:51.743929 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743912 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de640863-17a1-45ba-bba0-ed8998e7bc74-tls-certs\") pod \"de640863-17a1-45ba-bba0-ed8998e7bc74\" (UID: \"de640863-17a1-45ba-bba0-ed8998e7bc74\") "
Apr 24 21:56:51.744414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743938 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-home\") pod \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") "
Apr 24 21:56:51.744414 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.743983 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-dshm\") pod \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\" (UID: \"3c0e89ca-8bc8-4ba7-9628-64cad681f832\") "
Apr 24 21:56:51.746589 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.746326 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-dshm" (OuterVolumeSpecName: "dshm") pod "3c0e89ca-8bc8-4ba7-9628-64cad681f832" (UID: "3c0e89ca-8bc8-4ba7-9628-64cad681f832"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:51.746589 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.746347 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-dshm" (OuterVolumeSpecName: "dshm") pod "de640863-17a1-45ba-bba0-ed8998e7bc74" (UID: "de640863-17a1-45ba-bba0-ed8998e7bc74"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:51.746773 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.746593 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-model-cache" (OuterVolumeSpecName: "model-cache") pod "3c0e89ca-8bc8-4ba7-9628-64cad681f832" (UID: "3c0e89ca-8bc8-4ba7-9628-64cad681f832"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:51.747639 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.747204 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kube-api-access-9mt9t" (OuterVolumeSpecName: "kube-api-access-9mt9t") pod "3c0e89ca-8bc8-4ba7-9628-64cad681f832" (UID: "3c0e89ca-8bc8-4ba7-9628-64cad681f832"). InnerVolumeSpecName "kube-api-access-9mt9t". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:56:51.747639 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.747259 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-home" (OuterVolumeSpecName: "home") pod "de640863-17a1-45ba-bba0-ed8998e7bc74" (UID: "de640863-17a1-45ba-bba0-ed8998e7bc74"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:51.748061 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.747856 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-model-cache" (OuterVolumeSpecName: "model-cache") pod "de640863-17a1-45ba-bba0-ed8998e7bc74" (UID: "de640863-17a1-45ba-bba0-ed8998e7bc74"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:51.749550 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.748583 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-home" (OuterVolumeSpecName: "home") pod "3c0e89ca-8bc8-4ba7-9628-64cad681f832" (UID: "3c0e89ca-8bc8-4ba7-9628-64cad681f832"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 21:56:51.749825 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.749764 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/de640863-17a1-45ba-bba0-ed8998e7bc74-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "de640863-17a1-45ba-bba0-ed8998e7bc74" (UID: "de640863-17a1-45ba-bba0-ed8998e7bc74"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:56:51.750091 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.749989 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3c0e89ca-8bc8-4ba7-9628-64cad681f832" (UID: "3c0e89ca-8bc8-4ba7-9628-64cad681f832"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 21:56:51.750393 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.750367 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/de640863-17a1-45ba-bba0-ed8998e7bc74-kube-api-access-8t2jz" (OuterVolumeSpecName: "kube-api-access-8t2jz") pod "de640863-17a1-45ba-bba0-ed8998e7bc74" (UID: "de640863-17a1-45ba-bba0-ed8998e7bc74"). InnerVolumeSpecName "kube-api-access-8t2jz". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 21:56:51.753415 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.753396 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8_de640863-17a1-45ba-bba0-ed8998e7bc74/main/0.log"
Apr 24 21:56:51.754313 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.754288 2580 generic.go:358] "Generic (PLEG): container finished" podID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerID="15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be" exitCode=137
Apr 24 21:56:51.754407 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.754327 2580 generic.go:358] "Generic (PLEG): container finished" podID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerID="4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2" exitCode=0
Apr 24 21:56:51.754407 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.754366 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" event={"ID":"de640863-17a1-45ba-bba0-ed8998e7bc74","Type":"ContainerDied","Data":"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be"}
Apr 24 21:56:51.754407 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.754396 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" event={"ID":"de640863-17a1-45ba-bba0-ed8998e7bc74","Type":"ContainerDied","Data":"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2"}
Apr 24 21:56:51.754569 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.754411 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8" event={"ID":"de640863-17a1-45ba-bba0-ed8998e7bc74","Type":"ContainerDied","Data":"aad9b218cc3fad1458e7303967157868e91950716965e13c0a0a259704c1810c"}
Apr 24 21:56:51.754569 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.754411 2580 scope.go:117] "RemoveContainer" containerID="15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be"
Apr 24 21:56:51.754569 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.754395 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"
Apr 24 21:56:51.756911 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.756736 2580 generic.go:358] "Generic (PLEG): container finished" podID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerID="948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4" exitCode=137
Apr 24 21:56:51.756911 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.756793 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" event={"ID":"3c0e89ca-8bc8-4ba7-9628-64cad681f832","Type":"ContainerDied","Data":"948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4"}
Apr 24 21:56:51.756911 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.756817 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" event={"ID":"3c0e89ca-8bc8-4ba7-9628-64cad681f832","Type":"ContainerDied","Data":"7ff98efbcee595240de56f69d9251dac64e3c6ed5fa89a3761c139161aa23a34"}
Apr
24 21:56:51.756911 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.756819 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm" Apr 24 21:56:51.761886 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.761855 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "3c0e89ca-8bc8-4ba7-9628-64cad681f832" (UID: "3c0e89ca-8bc8-4ba7-9628-64cad681f832"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:51.762183 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.762150 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "de640863-17a1-45ba-bba0-ed8998e7bc74" (UID: "de640863-17a1-45ba-bba0-ed8998e7bc74"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:51.764652 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.764632 2580 scope.go:117] "RemoveContainer" containerID="cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd" Apr 24 21:56:51.801144 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.801081 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "de640863-17a1-45ba-bba0-ed8998e7bc74" (UID: "de640863-17a1-45ba-bba0-ed8998e7bc74"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:51.809879 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.809832 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3c0e89ca-8bc8-4ba7-9628-64cad681f832" (UID: "3c0e89ca-8bc8-4ba7-9628-64cad681f832"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:56:51.816494 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.816465 2580 scope.go:117] "RemoveContainer" containerID="4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2" Apr 24 21:56:51.824223 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.824201 2580 scope.go:117] "RemoveContainer" containerID="15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be" Apr 24 21:56:51.824495 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:56:51.824469 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be\": container with ID starting with 15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be not found: ID does not exist" containerID="15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be" Apr 24 21:56:51.824590 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.824505 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be"} err="failed to get container status \"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be\": rpc error: code = NotFound desc = could not find container \"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be\": container with ID starting with 
15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be not found: ID does not exist" Apr 24 21:56:51.824590 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.824526 2580 scope.go:117] "RemoveContainer" containerID="cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd" Apr 24 21:56:51.824809 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:56:51.824785 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd\": container with ID starting with cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd not found: ID does not exist" containerID="cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd" Apr 24 21:56:51.824879 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.824817 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd"} err="failed to get container status \"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd\": rpc error: code = NotFound desc = could not find container \"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd\": container with ID starting with cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd not found: ID does not exist" Apr 24 21:56:51.824879 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.824833 2580 scope.go:117] "RemoveContainer" containerID="4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2" Apr 24 21:56:51.825136 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:56:51.825116 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2\": container with ID starting with 4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2 not found: ID does not exist" 
containerID="4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2" Apr 24 21:56:51.825202 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.825144 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2"} err="failed to get container status \"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2\": rpc error: code = NotFound desc = could not find container \"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2\": container with ID starting with 4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2 not found: ID does not exist" Apr 24 21:56:51.825202 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.825156 2580 scope.go:117] "RemoveContainer" containerID="15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be" Apr 24 21:56:51.825375 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.825357 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be"} err="failed to get container status \"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be\": rpc error: code = NotFound desc = could not find container \"15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be\": container with ID starting with 15cf0c7d3f16ac6395c7ebeac38b9f35b876a4248a6eaf81c21c0a048f8d76be not found: ID does not exist" Apr 24 21:56:51.825440 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.825374 2580 scope.go:117] "RemoveContainer" containerID="cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd" Apr 24 21:56:51.825597 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.825578 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd"} err="failed to get container status 
\"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd\": rpc error: code = NotFound desc = could not find container \"cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd\": container with ID starting with cd65936c6916abc034928258a0b432e9d028d433eb8e83c0f45e92a4a99446dd not found: ID does not exist" Apr 24 21:56:51.825597 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.825596 2580 scope.go:117] "RemoveContainer" containerID="4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2" Apr 24 21:56:51.825801 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.825777 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2"} err="failed to get container status \"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2\": rpc error: code = NotFound desc = could not find container \"4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2\": container with ID starting with 4680a9bce64260593f2a640c4b0f468ee0f42fed6b09b1e9c4d0d85d089713c2 not found: ID does not exist" Apr 24 21:56:51.825871 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.825806 2580 scope.go:117] "RemoveContainer" containerID="948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4" Apr 24 21:56:51.833118 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.833085 2580 scope.go:117] "RemoveContainer" containerID="af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce" Apr 24 21:56:51.845471 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845445 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845471 2580 reconciler_common.go:299] "Volume detached for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845486 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845495 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-9mt9t\" (UniqueName: \"kubernetes.io/projected/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kube-api-access-9mt9t\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845504 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845513 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845521 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845530 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" 
Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845538 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8t2jz\" (UniqueName: \"kubernetes.io/projected/de640863-17a1-45ba-bba0-ed8998e7bc74-kube-api-access-8t2jz\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845546 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845555 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/de640863-17a1-45ba-bba0-ed8998e7bc74-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845562 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/de640863-17a1-45ba-bba0-ed8998e7bc74-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845569 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.845595 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.845576 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3c0e89ca-8bc8-4ba7-9628-64cad681f832-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:56:51.887157 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.887130 2580 scope.go:117] "RemoveContainer" containerID="948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4" Apr 24 
21:56:51.887523 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:56:51.887497 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4\": container with ID starting with 948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4 not found: ID does not exist" containerID="948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4" Apr 24 21:56:51.887577 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.887533 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4"} err="failed to get container status \"948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4\": rpc error: code = NotFound desc = could not find container \"948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4\": container with ID starting with 948747d57a632717917be98c2f6f16e237e5c8b663c388bcc5d5cd27465a7fd4 not found: ID does not exist" Apr 24 21:56:51.887577 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.887553 2580 scope.go:117] "RemoveContainer" containerID="af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce" Apr 24 21:56:51.887790 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:56:51.887771 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce\": container with ID starting with af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce not found: ID does not exist" containerID="af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce" Apr 24 21:56:51.887840 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:51.887799 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce"} err="failed to get container status \"af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce\": rpc error: code = NotFound desc = could not find container \"af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce\": container with ID starting with af34bfab50c1e15f4a0838aded3cb63ca0c4fc301e3b028e530095aa279f8fce not found: ID does not exist" Apr 24 21:56:52.080462 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:52.080428 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"] Apr 24 21:56:52.085335 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:52.085302 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-7d7d5999b7f2px8"] Apr 24 21:56:52.097591 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:52.097562 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"] Apr 24 21:56:52.101661 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:52.101634 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-75wljrm"] Apr 24 21:56:52.688901 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:52.688870 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" path="/var/lib/kubelet/pods/3c0e89ca-8bc8-4ba7-9628-64cad681f832/volumes" Apr 24 21:56:52.689337 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:56:52.689322 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" path="/var/lib/kubelet/pods/de640863-17a1-45ba-bba0-ed8998e7bc74/volumes" Apr 24 21:57:01.425463 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:01.425409 2580 
prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 24 21:57:11.424756 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:11.424715 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" probeResult="failure" output="Get \"https://10.132.0.39:8000/health\": dial tcp 10.132.0.39:8000: connect: connection refused" Apr 24 21:57:21.434542 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:21.434505 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:57:21.442186 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:21.442158 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:57:26.433039 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:26.432952 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"] Apr 24 21:57:26.433518 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:26.433340 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" containerID="cri-o://fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4" gracePeriod=30 Apr 24 21:57:28.731069 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:28.731031 2580 log.go:25] "Finished parsing log 
file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 21:57:28.735053 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:28.735030 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:57:28.735950 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:28.735932 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 21:57:28.740019 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:28.739986 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 21:57:34.904071 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:34.904032 2580 generic.go:358] "Generic (PLEG): container finished" podID="677c14b9-087d-4b41-839f-cc6a7d126def" containerID="bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9" exitCode=0 Apr 24 21:57:34.904449 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:34.904081 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" event={"ID":"677c14b9-087d-4b41-839f-cc6a7d126def","Type":"ContainerDied","Data":"bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9"} Apr 24 21:57:34.905166 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:34.905151 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 21:57:35.908816 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:35.908773 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" 
event={"ID":"677c14b9-087d-4b41-839f-cc6a7d126def","Type":"ContainerStarted","Data":"a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9"} Apr 24 21:57:35.932982 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:35.932914 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podStartSLOduration=65.932894674 podStartE2EDuration="1m5.932894674s" podCreationTimestamp="2026-04-24 21:56:30 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 21:57:35.932328086 +0000 UTC m=+1807.730391046" watchObservedRunningTime="2026-04-24 21:57:35.932894674 +0000 UTC m=+1807.730957645" Apr 24 21:57:40.974594 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:40.974560 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" Apr 24 21:57:40.974594 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:40.974605 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" Apr 24 21:57:40.975981 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:40.975952 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:57:50.975207 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:50.975160 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" 
output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:57:56.813548 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.813518 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj_38b619a1-d7c9-47a6-8014-5cf88416ffa9/main/0.log" Apr 24 21:57:56.813978 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.813960 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:57:56.918535 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.918493 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tmp-dir\") pod \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " Apr 24 21:57:56.918732 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.918555 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-qf9ck\" (UniqueName: \"kubernetes.io/projected/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kube-api-access-qf9ck\") pod \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " Apr 24 21:57:56.918732 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.918587 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-home\") pod \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " Apr 24 21:57:56.918732 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.918626 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-model-cache\") pod \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " Apr 24 21:57:56.918732 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.918682 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kserve-provision-location\") pod \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " Apr 24 21:57:56.918732 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.918731 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-dshm\") pod \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " Apr 24 21:57:56.919038 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.918772 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tls-certs\") pod \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\" (UID: \"38b619a1-d7c9-47a6-8014-5cf88416ffa9\") " Apr 24 21:57:56.919038 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.918978 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-model-cache" (OuterVolumeSpecName: "model-cache") pod "38b619a1-d7c9-47a6-8014-5cf88416ffa9" (UID: "38b619a1-d7c9-47a6-8014-5cf88416ffa9"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:56.919322 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.919286 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-home" (OuterVolumeSpecName: "home") pod "38b619a1-d7c9-47a6-8014-5cf88416ffa9" (UID: "38b619a1-d7c9-47a6-8014-5cf88416ffa9"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:56.921140 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.921119 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "38b619a1-d7c9-47a6-8014-5cf88416ffa9" (UID: "38b619a1-d7c9-47a6-8014-5cf88416ffa9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 24 21:57:56.921450 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.921419 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kube-api-access-qf9ck" (OuterVolumeSpecName: "kube-api-access-qf9ck") pod "38b619a1-d7c9-47a6-8014-5cf88416ffa9" (UID: "38b619a1-d7c9-47a6-8014-5cf88416ffa9"). InnerVolumeSpecName "kube-api-access-qf9ck". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 21:57:56.921538 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.921503 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-dshm" (OuterVolumeSpecName: "dshm") pod "38b619a1-d7c9-47a6-8014-5cf88416ffa9" (UID: "38b619a1-d7c9-47a6-8014-5cf88416ffa9"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:56.940642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.940576 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "38b619a1-d7c9-47a6-8014-5cf88416ffa9" (UID: "38b619a1-d7c9-47a6-8014-5cf88416ffa9"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:56.978660 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.978564 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "38b619a1-d7c9-47a6-8014-5cf88416ffa9" (UID: "38b619a1-d7c9-47a6-8014-5cf88416ffa9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 21:57:56.981071 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.981046 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj_38b619a1-d7c9-47a6-8014-5cf88416ffa9/main/0.log" Apr 24 21:57:56.981431 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.981408 2580 generic.go:358] "Generic (PLEG): container finished" podID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerID="fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4" exitCode=137 Apr 24 21:57:56.981507 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.981483 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" event={"ID":"38b619a1-d7c9-47a6-8014-5cf88416ffa9","Type":"ContainerDied","Data":"fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4"} Apr 24 21:57:56.981548 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.981528 
2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" event={"ID":"38b619a1-d7c9-47a6-8014-5cf88416ffa9","Type":"ContainerDied","Data":"f3cd2d6f9cb5c8a343418ac9a9f4e93f643abf1d1d7ae49e79455cf59e1962bb"} Apr 24 21:57:56.981548 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.981530 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj" Apr 24 21:57:56.981548 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.981545 2580 scope.go:117] "RemoveContainer" containerID="fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4" Apr 24 21:57:56.990344 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:56.990308 2580 scope.go:117] "RemoveContainer" containerID="782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f" Apr 24 21:57:57.011718 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.011684 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"] Apr 24 21:57:57.013344 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.013319 2580 scope.go:117] "RemoveContainer" containerID="fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4" Apr 24 21:57:57.013698 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:57:57.013674 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4\": container with ID starting with fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4 not found: ID does not exist" containerID="fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4" Apr 24 21:57:57.013779 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.013708 2580 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4"} err="failed to get container status \"fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4\": rpc error: code = NotFound desc = could not find container \"fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4\": container with ID starting with fb92c2b4555a8ab5a7a88b553073e706c95ba0d13d098e26ae698691e795d5b4 not found: ID does not exist" Apr 24 21:57:57.013779 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.013728 2580 scope.go:117] "RemoveContainer" containerID="782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f" Apr 24 21:57:57.013978 ip-10-0-133-36 kubenswrapper[2580]: E0424 21:57:57.013962 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f\": container with ID starting with 782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f not found: ID does not exist" containerID="782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f" Apr 24 21:57:57.014049 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.013982 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f"} err="failed to get container status \"782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f\": rpc error: code = NotFound desc = could not find container \"782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f\": container with ID starting with 782204d4bea912e2a44599b7db2659e95690fd66c6d3470d5843841eacd0af9f not found: ID does not exist" Apr 24 21:57:57.019315 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.019288 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-9d67df544-mdbgj"] Apr 24 
21:57:57.019727 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.019705 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:57:57.019771 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.019736 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:57:57.019771 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.019753 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:57:57.019835 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.019770 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:57:57.019835 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.019786 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:57:57.019835 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.019800 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-qf9ck\" (UniqueName: \"kubernetes.io/projected/38b619a1-d7c9-47a6-8014-5cf88416ffa9-kube-api-access-qf9ck\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:57:57.019835 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:57.019813 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/38b619a1-d7c9-47a6-8014-5cf88416ffa9-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 21:57:58.688715 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:57:58.688674 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" path="/var/lib/kubelet/pods/38b619a1-d7c9-47a6-8014-5cf88416ffa9/volumes" Apr 24 21:58:00.185257 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185217 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185519 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185530 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185540 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185545 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185557 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="storage-initializer" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185563 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="storage-initializer" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185571 2580 
cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="llm-d-routing-sidecar" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185580 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="llm-d-routing-sidecar" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185593 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185601 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185613 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="storage-initializer" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185622 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="storage-initializer" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185629 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="storage-initializer" Apr 24 21:58:00.185642 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185637 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="storage-initializer" Apr 24 21:58:00.186198 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185721 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="llm-d-routing-sidecar" Apr 24 21:58:00.186198 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:58:00.185734 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="de640863-17a1-45ba-bba0-ed8998e7bc74" containerName="main" Apr 24 21:58:00.186198 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185747 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c0e89ca-8bc8-4ba7-9628-64cad681f832" containerName="main" Apr 24 21:58:00.186198 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.185757 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="38b619a1-d7c9-47a6-8014-5cf88416ffa9" containerName="main" Apr 24 21:58:00.190684 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.190657 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.193410 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.193386 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 24 21:58:00.193555 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.193472 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-2xflz\"" Apr 24 21:58:00.201299 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.201260 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 21:58:00.248012 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.247968 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 
21:58:00.248192 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.248073 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.248192 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.248147 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.248192 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.248175 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8kk6m\" (UniqueName: \"kubernetes.io/projected/d1577797-df68-436b-9458-1060b565cf5c-kube-api-access-8kk6m\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.248349 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.248197 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1577797-df68-436b-9458-1060b565cf5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.248349 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.248243 2580 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.248349 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.248292 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.348687 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.348643 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.348687 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.348696 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.348952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.348717 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8kk6m\" (UniqueName: 
\"kubernetes.io/projected/d1577797-df68-436b-9458-1060b565cf5c-kube-api-access-8kk6m\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.348952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.348751 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1577797-df68-436b-9458-1060b565cf5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.348952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.348808 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.348952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.348860 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.348952 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.348888 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: 
\"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.349235 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.349085 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.349314 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.349246 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.349314 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.349293 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.349423 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.349356 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.351332 ip-10-0-133-36 kubenswrapper[2580]: I0424 
21:58:00.351304 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1577797-df68-436b-9458-1060b565cf5c-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.351453 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.351330 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.368397 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.368325 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8kk6m\" (UniqueName: \"kubernetes.io/projected/d1577797-df68-436b-9458-1060b565cf5c-kube-api-access-8kk6m\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.502355 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.502258 2580 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 21:58:00.644662 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.644581 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 21:58:00.648166 ip-10-0-133-36 kubenswrapper[2580]: W0424 21:58:00.648135 2580 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1577797_df68_436b_9458_1060b565cf5c.slice/crio-cfcf469b7a8322d59d1d06da3ee64b4a7cab157a7726e2eecf282d9224af9947 WatchSource:0}: Error finding container cfcf469b7a8322d59d1d06da3ee64b4a7cab157a7726e2eecf282d9224af9947: Status 404 returned error can't find the container with id cfcf469b7a8322d59d1d06da3ee64b4a7cab157a7726e2eecf282d9224af9947 Apr 24 21:58:00.974623 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.974583 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:58:00.997498 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.997461 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"d1577797-df68-436b-9458-1060b565cf5c","Type":"ContainerStarted","Data":"7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc"} Apr 24 21:58:00.997676 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:00.997505 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
event={"ID":"d1577797-df68-436b-9458-1060b565cf5c","Type":"ContainerStarted","Data":"cfcf469b7a8322d59d1d06da3ee64b4a7cab157a7726e2eecf282d9224af9947"} Apr 24 21:58:10.975090 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:10.975055 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:58:20.975235 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:20.975189 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:58:30.974330 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:30.974278 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:58:40.975164 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:40.975114 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:58:50.974771 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:58:50.974661 2580 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:59:00.974985 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:59:00.974936 2580 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" probeResult="failure" output="Get \"https://10.132.0.41:8000/health\": dial tcp 10.132.0.41:8000: connect: connection refused" Apr 24 21:59:10.985248 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:59:10.985207 2580 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" Apr 24 21:59:10.993378 ip-10-0-133-36 kubenswrapper[2580]: I0424 21:59:10.993351 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" Apr 24 22:02:28.753271 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:02:28.753231 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 22:02:28.759627 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:02:28.759602 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log" Apr 24 22:02:28.762386 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:02:28.762354 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 22:02:28.766100 ip-10-0-133-36 
kubenswrapper[2580]: I0424 22:02:28.766079 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 22:07:28.779105 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:07:28.779071 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 22:07:28.782933 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:07:28.782909 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 22:07:28.785292 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:07:28.785270 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 22:07:28.789066 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:07:28.789047 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 22:11:24.209182 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:24.209068 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"]
Apr 24 22:11:24.211545 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:24.209419 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" containerID="cri-o://a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9" gracePeriod=30
Apr 24 22:11:24.222248 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:24.222216 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"]
Apr 24 22:11:24.222520 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:24.222498 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" containerID="cri-o://5ff0f6c9ab6f52c33dafb9f515ed9001b67fe36d40044d0e97362599861d640d" gracePeriod=30
Apr 24 22:11:24.222609 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:24.222556 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="storage-initializer" containerID="cri-o://5f21e0ee72f529af0f44a3a684ead0a0f9b1ba546b0e95e57ec361c61e621771" gracePeriod=30
Apr 24 22:11:24.634373 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:24.634339 2580 generic.go:358] "Generic (PLEG): container finished" podID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerID="5ff0f6c9ab6f52c33dafb9f515ed9001b67fe36d40044d0e97362599861d640d" exitCode=0
Apr 24 22:11:24.634554 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:24.634390 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" event={"ID":"14aed0c5-69c3-4390-902b-ef799b0f9c16","Type":"ContainerDied","Data":"5ff0f6c9ab6f52c33dafb9f515ed9001b67fe36d40044d0e97362599861d640d"}
Apr 24 22:11:30.945373 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:30.945319 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused"
Apr 24 22:11:33.687654 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:33.687607 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused"
Apr 24 22:11:40.944756 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:40.944712 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused"
Apr 24 22:11:43.688676 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:43.688627 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused"
Apr 24 22:11:50.944981 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:50.944936 2580 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused"
Apr 24 22:11:50.945400 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:50.945065 2580 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 22:11:53.687788 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:53.687736 2580 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" probeResult="failure" output="Get \"https://10.132.0.40:8000/health\": dial tcp 10.132.0.40:8000: connect: connection refused"
Apr 24 22:11:54.478430 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.478406 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 22:11:54.570421 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.570386 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-dshm\") pod \"677c14b9-087d-4b41-839f-cc6a7d126def\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") "
Apr 24 22:11:54.570421 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.570426 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j2vbq\" (UniqueName: \"kubernetes.io/projected/677c14b9-087d-4b41-839f-cc6a7d126def-kube-api-access-j2vbq\") pod \"677c14b9-087d-4b41-839f-cc6a7d126def\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") "
Apr 24 22:11:54.570675 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.570457 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/677c14b9-087d-4b41-839f-cc6a7d126def-tls-certs\") pod \"677c14b9-087d-4b41-839f-cc6a7d126def\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") "
Apr 24 22:11:54.570675 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.570492 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-home\") pod \"677c14b9-087d-4b41-839f-cc6a7d126def\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") "
Apr 24 22:11:54.570675 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.570559 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-model-cache\") pod \"677c14b9-087d-4b41-839f-cc6a7d126def\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") "
Apr 24 22:11:54.570675 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.570583 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-kserve-provision-location\") pod \"677c14b9-087d-4b41-839f-cc6a7d126def\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") "
Apr 24 22:11:54.570675 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.570603 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-tmp-dir\") pod \"677c14b9-087d-4b41-839f-cc6a7d126def\" (UID: \"677c14b9-087d-4b41-839f-cc6a7d126def\") "
Apr 24 22:11:54.570942 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.570818 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-model-cache" (OuterVolumeSpecName: "model-cache") pod "677c14b9-087d-4b41-839f-cc6a7d126def" (UID: "677c14b9-087d-4b41-839f-cc6a7d126def"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.571301 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.571273 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-home" (OuterVolumeSpecName: "home") pod "677c14b9-087d-4b41-839f-cc6a7d126def" (UID: "677c14b9-087d-4b41-839f-cc6a7d126def"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.572752 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.572724 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/677c14b9-087d-4b41-839f-cc6a7d126def-kube-api-access-j2vbq" (OuterVolumeSpecName: "kube-api-access-j2vbq") pod "677c14b9-087d-4b41-839f-cc6a7d126def" (UID: "677c14b9-087d-4b41-839f-cc6a7d126def"). InnerVolumeSpecName "kube-api-access-j2vbq". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:11:54.572858 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.572767 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-dshm" (OuterVolumeSpecName: "dshm") pod "677c14b9-087d-4b41-839f-cc6a7d126def" (UID: "677c14b9-087d-4b41-839f-cc6a7d126def"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.572858 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.572765 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/677c14b9-087d-4b41-839f-cc6a7d126def-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "677c14b9-087d-4b41-839f-cc6a7d126def" (UID: "677c14b9-087d-4b41-839f-cc6a7d126def"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:11:54.583650 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.583619 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "677c14b9-087d-4b41-839f-cc6a7d126def" (UID: "677c14b9-087d-4b41-839f-cc6a7d126def"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.625754 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.625716 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "677c14b9-087d-4b41-839f-cc6a7d126def" (UID: "677c14b9-087d-4b41-839f-cc6a7d126def"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.672140 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.672108 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/677c14b9-087d-4b41-839f-cc6a7d126def-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.672140 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.672138 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.672275 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.672147 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.672275 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.672156 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.672275 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.672167 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.672275 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.672175 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/677c14b9-087d-4b41-839f-cc6a7d126def-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.672275 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.672184 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j2vbq\" (UniqueName: \"kubernetes.io/projected/677c14b9-087d-4b41-839f-cc6a7d126def-kube-api-access-j2vbq\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.728896 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.728813 2580 generic.go:358] "Generic (PLEG): container finished" podID="677c14b9-087d-4b41-839f-cc6a7d126def" containerID="a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9" exitCode=137
Apr 24 22:11:54.728896 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.728894 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"
Apr 24 22:11:54.729393 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.728893 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" event={"ID":"677c14b9-087d-4b41-839f-cc6a7d126def","Type":"ContainerDied","Data":"a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9"}
Apr 24 22:11:54.729393 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.728942 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth" event={"ID":"677c14b9-087d-4b41-839f-cc6a7d126def","Type":"ContainerDied","Data":"f92ddbfa1ea382e8084aad9dc2d4b4118624e67c6f4d81b4eee1e9d1ef082ec5"}
Apr 24 22:11:54.729393 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.728965 2580 scope.go:117] "RemoveContainer" containerID="a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9"
Apr 24 22:11:54.730371 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.730350 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg_14aed0c5-69c3-4390-902b-ef799b0f9c16/storage-initializer/0.log"
Apr 24 22:11:54.730685 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.730661 2580 generic.go:358] "Generic (PLEG): container finished" podID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerID="5f21e0ee72f529af0f44a3a684ead0a0f9b1ba546b0e95e57ec361c61e621771" exitCode=137
Apr 24 22:11:54.730786 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.730709 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" event={"ID":"14aed0c5-69c3-4390-902b-ef799b0f9c16","Type":"ContainerDied","Data":"5f21e0ee72f529af0f44a3a684ead0a0f9b1ba546b0e95e57ec361c61e621771"}
Apr 24 22:11:54.745137 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.745108 2580 scope.go:117] "RemoveContainer" containerID="bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9"
Apr 24 22:11:54.750807 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.750775 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"]
Apr 24 22:11:54.755987 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.755957 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-787d98cd89-7lzth"]
Apr 24 22:11:54.809831 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.809806 2580 scope.go:117] "RemoveContainer" containerID="a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9"
Apr 24 22:11:54.810206 ip-10-0-133-36 kubenswrapper[2580]: E0424 22:11:54.810182 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9\": container with ID starting with a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9 not found: ID does not exist" containerID="a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9"
Apr 24 22:11:54.810303 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.810217 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9"} err="failed to get container status \"a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9\": rpc error: code = NotFound desc = could not find container \"a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9\": container with ID starting with a07c8aaee8473e0747472e3fc9e8e1d50c3c566437feeec36e7876e9831072f9 not found: ID does not exist"
Apr 24 22:11:54.810303 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.810239 2580 scope.go:117] "RemoveContainer" containerID="bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9"
Apr 24 22:11:54.810521 ip-10-0-133-36 kubenswrapper[2580]: E0424 22:11:54.810504 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9\": container with ID starting with bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9 not found: ID does not exist" containerID="bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9"
Apr 24 22:11:54.810580 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.810525 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9"} err="failed to get container status \"bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9\": rpc error: code = NotFound desc = could not find container \"bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9\": container with ID starting with bba0f03f9cccbec7ec754908957accced397e96667e8d1c2e48592dd6329dda9 not found: ID does not exist"
Apr 24 22:11:54.864494 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.864471 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg_14aed0c5-69c3-4390-902b-ef799b0f9c16/storage-initializer/0.log"
Apr 24 22:11:54.864827 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.864809 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 22:11:54.974766 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.974725 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-model-cache\") pod \"14aed0c5-69c3-4390-902b-ef799b0f9c16\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") "
Apr 24 22:11:54.974766 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.974771 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-tmp-dir\") pod \"14aed0c5-69c3-4390-902b-ef799b0f9c16\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") "
Apr 24 22:11:54.975065 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.974804 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-dshm\") pod \"14aed0c5-69c3-4390-902b-ef799b0f9c16\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") "
Apr 24 22:11:54.975065 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.974824 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14aed0c5-69c3-4390-902b-ef799b0f9c16-tls-certs\") pod \"14aed0c5-69c3-4390-902b-ef799b0f9c16\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") "
Apr 24 22:11:54.975065 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.974841 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-kserve-provision-location\") pod \"14aed0c5-69c3-4390-902b-ef799b0f9c16\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") "
Apr 24 22:11:54.975065 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.974882 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-tc8cn\" (UniqueName: \"kubernetes.io/projected/14aed0c5-69c3-4390-902b-ef799b0f9c16-kube-api-access-tc8cn\") pod \"14aed0c5-69c3-4390-902b-ef799b0f9c16\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") "
Apr 24 22:11:54.975065 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.975051 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-model-cache" (OuterVolumeSpecName: "model-cache") pod "14aed0c5-69c3-4390-902b-ef799b0f9c16" (UID: "14aed0c5-69c3-4390-902b-ef799b0f9c16"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.975333 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.975063 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-home\") pod \"14aed0c5-69c3-4390-902b-ef799b0f9c16\" (UID: \"14aed0c5-69c3-4390-902b-ef799b0f9c16\") "
Apr 24 22:11:54.975333 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.975238 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "14aed0c5-69c3-4390-902b-ef799b0f9c16" (UID: "14aed0c5-69c3-4390-902b-ef799b0f9c16"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.975333 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.975311 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-home" (OuterVolumeSpecName: "home") pod "14aed0c5-69c3-4390-902b-ef799b0f9c16" (UID: "14aed0c5-69c3-4390-902b-ef799b0f9c16"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.975496 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.975400 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.975496 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.975420 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.975496 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.975438 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:54.976969 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.976951 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-dshm" (OuterVolumeSpecName: "dshm") pod "14aed0c5-69c3-4390-902b-ef799b0f9c16" (UID: "14aed0c5-69c3-4390-902b-ef799b0f9c16"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:54.977399 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.977377 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/14aed0c5-69c3-4390-902b-ef799b0f9c16-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "14aed0c5-69c3-4390-902b-ef799b0f9c16" (UID: "14aed0c5-69c3-4390-902b-ef799b0f9c16"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:11:54.977399 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:54.977392 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/14aed0c5-69c3-4390-902b-ef799b0f9c16-kube-api-access-tc8cn" (OuterVolumeSpecName: "kube-api-access-tc8cn") pod "14aed0c5-69c3-4390-902b-ef799b0f9c16" (UID: "14aed0c5-69c3-4390-902b-ef799b0f9c16"). InnerVolumeSpecName "kube-api-access-tc8cn". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:11:55.030116 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.029985 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "14aed0c5-69c3-4390-902b-ef799b0f9c16" (UID: "14aed0c5-69c3-4390-902b-ef799b0f9c16"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:11:55.076613 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.076579 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:55.076613 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.076609 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/14aed0c5-69c3-4390-902b-ef799b0f9c16-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:55.076613 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.076621 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/14aed0c5-69c3-4390-902b-ef799b0f9c16-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:55.076922 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.076630 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-tc8cn\" (UniqueName: \"kubernetes.io/projected/14aed0c5-69c3-4390-902b-ef799b0f9c16-kube-api-access-tc8cn\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:11:55.735464 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.735435 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg_14aed0c5-69c3-4390-902b-ef799b0f9c16/storage-initializer/0.log"
Apr 24 22:11:55.735899 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.735852 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"
Apr 24 22:11:55.735899 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.735861 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg" event={"ID":"14aed0c5-69c3-4390-902b-ef799b0f9c16","Type":"ContainerDied","Data":"53b7c21e404844a9e17e21539b2b57db985a7af790b0374b3fd5cc8c35a90120"}
Apr 24 22:11:55.736055 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.735915 2580 scope.go:117] "RemoveContainer" containerID="5f21e0ee72f529af0f44a3a684ead0a0f9b1ba546b0e95e57ec361c61e621771"
Apr 24 22:11:55.764878 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.764855 2580 scope.go:117] "RemoveContainer" containerID="5ff0f6c9ab6f52c33dafb9f515ed9001b67fe36d40044d0e97362599861d640d"
Apr 24 22:11:55.783507 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.783478 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"]
Apr 24 22:11:55.788274 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:55.788244 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-5c8d49b9d-gvqfg"]
Apr 24 22:11:56.688849 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:56.688816 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" path="/var/lib/kubelet/pods/14aed0c5-69c3-4390-902b-ef799b0f9c16/volumes"
Apr 24 22:11:56.689264 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:11:56.689250 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" path="/var/lib/kubelet/pods/677c14b9-087d-4b41-839f-cc6a7d126def/volumes"
Apr 24 22:12:28.800828 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:28.800740 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 22:12:28.804490 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:28.804465 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 22:12:28.807624 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:28.807606 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log"
Apr 24 22:12:28.811095 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:28.811079 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 22:12:29.203212 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:29.203176 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 24 22:12:29.203527 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:29.203478 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="d1577797-df68-436b-9458-1060b565cf5c" containerName="storage-initializer" containerID="cri-o://7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc" gracePeriod=30
Apr 24 22:12:59.387847 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.387820 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0_d1577797-df68-436b-9458-1060b565cf5c/storage-initializer/0.log"
Apr 24 22:12:59.388219 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.387890 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 24 22:12:59.487430 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487324 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-home\") pod \"d1577797-df68-436b-9458-1060b565cf5c\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") "
Apr 24 22:12:59.487430 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487378 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1577797-df68-436b-9458-1060b565cf5c-tls-certs\") pod \"d1577797-df68-436b-9458-1060b565cf5c\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") "
Apr 24 22:12:59.487430 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487410 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-model-cache\") pod \"d1577797-df68-436b-9458-1060b565cf5c\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") "
Apr 24 22:12:59.487729 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487452 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-tmp-dir\") pod \"d1577797-df68-436b-9458-1060b565cf5c\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") "
Apr 24 22:12:59.487729 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487486 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-dshm\") pod \"d1577797-df68-436b-9458-1060b565cf5c\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") "
Apr 24 22:12:59.487729 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487515 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-kserve-provision-location\") pod \"d1577797-df68-436b-9458-1060b565cf5c\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") "
Apr 24 22:12:59.487729 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487539 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8kk6m\" (UniqueName: \"kubernetes.io/projected/d1577797-df68-436b-9458-1060b565cf5c-kube-api-access-8kk6m\") pod \"d1577797-df68-436b-9458-1060b565cf5c\" (UID: \"d1577797-df68-436b-9458-1060b565cf5c\") "
Apr 24 22:12:59.487729 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487717 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-home" (OuterVolumeSpecName: "home") pod "d1577797-df68-436b-9458-1060b565cf5c" (UID: "d1577797-df68-436b-9458-1060b565cf5c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:12:59.487918 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487736 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-model-cache" (OuterVolumeSpecName: "model-cache") pod "d1577797-df68-436b-9458-1060b565cf5c" (UID: "d1577797-df68-436b-9458-1060b565cf5c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:12:59.487918 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.487831 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "d1577797-df68-436b-9458-1060b565cf5c" (UID: "d1577797-df68-436b-9458-1060b565cf5c"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:12:59.489805 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.489775 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-dshm" (OuterVolumeSpecName: "dshm") pod "d1577797-df68-436b-9458-1060b565cf5c" (UID: "d1577797-df68-436b-9458-1060b565cf5c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:12:59.489937 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.489838 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1577797-df68-436b-9458-1060b565cf5c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d1577797-df68-436b-9458-1060b565cf5c" (UID: "d1577797-df68-436b-9458-1060b565cf5c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 24 22:12:59.489937 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.489854 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1577797-df68-436b-9458-1060b565cf5c-kube-api-access-8kk6m" (OuterVolumeSpecName: "kube-api-access-8kk6m") pod "d1577797-df68-436b-9458-1060b565cf5c" (UID: "d1577797-df68-436b-9458-1060b565cf5c"). InnerVolumeSpecName "kube-api-access-8kk6m". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 24 22:12:59.532774 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.532723 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d1577797-df68-436b-9458-1060b565cf5c" (UID: "d1577797-df68-436b-9458-1060b565cf5c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 24 22:12:59.588225 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.588183 2580 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-model-cache\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:12:59.588225 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.588217 2580 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-tmp-dir\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:12:59.588225 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.588225 2580 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-dshm\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:12:59.588225 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.588235 2580 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-kserve-provision-location\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:12:59.588585 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.588247 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8kk6m\" (UniqueName: \"kubernetes.io/projected/d1577797-df68-436b-9458-1060b565cf5c-kube-api-access-8kk6m\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:12:59.588585 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.588256 2580 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1577797-df68-436b-9458-1060b565cf5c-home\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:12:59.588585 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.588265 2580 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1577797-df68-436b-9458-1060b565cf5c-tls-certs\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\""
Apr 24 22:12:59.947729 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.947702 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0_d1577797-df68-436b-9458-1060b565cf5c/storage-initializer/0.log"
Apr 24 22:12:59.947904 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.947746 2580 generic.go:358] "Generic (PLEG): container finished" podID="d1577797-df68-436b-9458-1060b565cf5c" containerID="7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc" exitCode=137
Apr 24 22:12:59.947904 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.947827 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 24 22:12:59.947904 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.947839 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"d1577797-df68-436b-9458-1060b565cf5c","Type":"ContainerDied","Data":"7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc"} Apr 24 22:12:59.947904 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.947881 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"d1577797-df68-436b-9458-1060b565cf5c","Type":"ContainerDied","Data":"cfcf469b7a8322d59d1d06da3ee64b4a7cab157a7726e2eecf282d9224af9947"} Apr 24 22:12:59.947904 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.947900 2580 scope.go:117] "RemoveContainer" containerID="7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc" Apr 24 22:12:59.978890 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.978864 2580 scope.go:117] "RemoveContainer" containerID="7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc" Apr 24 22:12:59.979277 ip-10-0-133-36 kubenswrapper[2580]: E0424 22:12:59.979233 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc\": container with ID starting with 7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc not found: ID does not exist" containerID="7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc" Apr 24 22:12:59.979347 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.979283 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc"} err="failed to get container status 
\"7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc\": rpc error: code = NotFound desc = could not find container \"7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc\": container with ID starting with 7e3dba6afe8cfaede5784525a0ac9f728ec3fa15479eb07c5a97e459e11c05dc not found: ID does not exist" Apr 24 22:12:59.986490 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.986461 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:12:59.992407 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:12:59.992353 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 24 22:13:00.688484 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:13:00.688445 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1577797-df68-436b-9458-1060b565cf5c" path="/var/lib/kubelet/pods/d1577797-df68-436b-9458-1060b565cf5c/volumes" Apr 24 22:15:49.644389 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644353 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-5ch7l/must-gather-d5h6m"] Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644633 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="storage-initializer" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644644 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="storage-initializer" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644655 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1577797-df68-436b-9458-1060b565cf5c" containerName="storage-initializer" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644661 2580 
state_mem.go:107] "Deleted CPUSet assignment" podUID="d1577797-df68-436b-9458-1060b565cf5c" containerName="storage-initializer" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644666 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644672 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644682 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644689 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644701 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="storage-initializer" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644706 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="storage-initializer" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644759 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="677c14b9-087d-4b41-839f-cc6a7d126def" containerName="main" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644780 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="llm-d-routing-sidecar" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644788 2580 
memory_manager.go:356] "RemoveStaleState removing state" podUID="14aed0c5-69c3-4390-902b-ef799b0f9c16" containerName="storage-initializer" Apr 24 22:15:49.644797 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.644794 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1577797-df68-436b-9458-1060b565cf5c" containerName="storage-initializer" Apr 24 22:15:49.647594 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.647576 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:15:49.649844 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.649821 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5ch7l\"/\"openshift-service-ca.crt\"" Apr 24 22:15:49.650034 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.649904 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-5ch7l\"/\"kube-root-ca.crt\"" Apr 24 22:15:49.650665 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.650649 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-5ch7l\"/\"default-dockercfg-h9ln4\"" Apr 24 22:15:49.655482 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.655460 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5ch7l/must-gather-d5h6m"] Apr 24 22:15:49.748861 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.748829 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/620ff3f4-7601-4fa8-989c-b5a98f2cc884-must-gather-output\") pod \"must-gather-d5h6m\" (UID: \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\") " pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:15:49.749049 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.748876 2580 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4m6rb\" (UniqueName: \"kubernetes.io/projected/620ff3f4-7601-4fa8-989c-b5a98f2cc884-kube-api-access-4m6rb\") pod \"must-gather-d5h6m\" (UID: \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\") " pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:15:49.849299 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.849267 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/620ff3f4-7601-4fa8-989c-b5a98f2cc884-must-gather-output\") pod \"must-gather-d5h6m\" (UID: \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\") " pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:15:49.849457 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.849316 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4m6rb\" (UniqueName: \"kubernetes.io/projected/620ff3f4-7601-4fa8-989c-b5a98f2cc884-kube-api-access-4m6rb\") pod \"must-gather-d5h6m\" (UID: \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\") " pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:15:49.849600 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.849579 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/620ff3f4-7601-4fa8-989c-b5a98f2cc884-must-gather-output\") pod \"must-gather-d5h6m\" (UID: \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\") " pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:15:49.858067 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.858047 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4m6rb\" (UniqueName: \"kubernetes.io/projected/620ff3f4-7601-4fa8-989c-b5a98f2cc884-kube-api-access-4m6rb\") pod \"must-gather-d5h6m\" (UID: \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\") " pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:15:49.957652 
ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:49.957581 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:15:50.086488 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:50.086389 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-5ch7l/must-gather-d5h6m"] Apr 24 22:15:50.086730 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:50.086711 2580 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 24 22:15:50.451268 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:50.451224 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" event={"ID":"620ff3f4-7601-4fa8-989c-b5a98f2cc884","Type":"ContainerStarted","Data":"be25dde0e2ecd0b89e717730f52e62cf09d81be6af29e4b9f8c2326e620d9c86"} Apr 24 22:15:54.466386 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:54.466348 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" event={"ID":"620ff3f4-7601-4fa8-989c-b5a98f2cc884","Type":"ContainerStarted","Data":"c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111"} Apr 24 22:15:54.466386 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:54.466391 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" event={"ID":"620ff3f4-7601-4fa8-989c-b5a98f2cc884","Type":"ContainerStarted","Data":"f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3"} Apr 24 22:15:54.483739 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:15:54.483650 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" podStartSLOduration=1.557015303 podStartE2EDuration="5.483630531s" podCreationTimestamp="2026-04-24 22:15:49 +0000 UTC" firstStartedPulling="2026-04-24 22:15:50.086861642 +0000 UTC m=+2901.884924589" 
lastFinishedPulling="2026-04-24 22:15:54.013476869 +0000 UTC m=+2905.811539817" observedRunningTime="2026-04-24 22:15:54.482777588 +0000 UTC m=+2906.280840559" watchObservedRunningTime="2026-04-24 22:15:54.483630531 +0000 UTC m=+2906.281693528" Apr 24 22:16:19.449097 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:19.449070 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7f4c79d4bd-dxrjx_a318551b-2eb1-436a-8979-f4e740c2e662/router/0.log" Apr 24 22:16:20.332946 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:20.332913 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7f4c79d4bd-dxrjx_a318551b-2eb1-436a-8979-f4e740c2e662/router/0.log" Apr 24 22:16:21.208527 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:21.208493 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-bvg5d_40b0cb6d-5360-4969-b911-a4d5822d26e9/kuadrant-console-plugin/0.log" Apr 24 22:16:21.259291 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:21.259265 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-mrc5l_7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040/limitador/0.log" Apr 24 22:16:22.556074 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:22.556036 2580 generic.go:358] "Generic (PLEG): container finished" podID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerID="f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3" exitCode=0 Apr 24 22:16:22.556541 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:22.556116 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" event={"ID":"620ff3f4-7601-4fa8-989c-b5a98f2cc884","Type":"ContainerDied","Data":"f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3"} Apr 24 22:16:22.556541 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:22.556450 2580 scope.go:117] "RemoveContainer" 
containerID="f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3" Apr 24 22:16:22.990489 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:22.990461 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5ch7l_must-gather-d5h6m_620ff3f4-7601-4fa8-989c-b5a98f2cc884/gather/0.log" Apr 24 22:16:26.675021 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:26.674978 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-m4t52_74148c1b-8ecf-4750-b22d-3bb17904b081/global-pull-secret-syncer/0.log" Apr 24 22:16:26.847774 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:26.847744 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-rvnzj_9f68a401-0090-4e11-a8d5-8ba136fddbec/konnectivity-agent/0.log" Apr 24 22:16:26.868948 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:26.868929 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-36.ec2.internal_4fcab00268e8e626e83b344c210c25fb/haproxy/0.log" Apr 24 22:16:28.494698 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.494663 2580 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-5ch7l/must-gather-d5h6m"] Apr 24 22:16:28.495162 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.494888 2580 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerName="copy" containerID="cri-o://c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111" gracePeriod=2 Apr 24 22:16:28.499426 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.498945 2580 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-5ch7l/must-gather-d5h6m"] Apr 24 22:16:28.688777 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.688738 2580 status_manager.go:895] "Failed to get status for pod" 
podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" err="pods \"must-gather-d5h6m\" is forbidden: User \"system:node:ip-10-0-133-36.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-5ch7l\": no relationship found between node 'ip-10-0-133-36.ec2.internal' and this object" Apr 24 22:16:28.723143 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.723124 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-5ch7l_must-gather-d5h6m_620ff3f4-7601-4fa8-989c-b5a98f2cc884/copy/0.log" Apr 24 22:16:28.723475 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.723460 2580 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:16:28.779847 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.779778 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/620ff3f4-7601-4fa8-989c-b5a98f2cc884-must-gather-output\") pod \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\" (UID: \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\") " Apr 24 22:16:28.779967 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.779868 2580 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4m6rb\" (UniqueName: \"kubernetes.io/projected/620ff3f4-7601-4fa8-989c-b5a98f2cc884-kube-api-access-4m6rb\") pod \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\" (UID: \"620ff3f4-7601-4fa8-989c-b5a98f2cc884\") " Apr 24 22:16:28.781959 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.781928 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/620ff3f4-7601-4fa8-989c-b5a98f2cc884-kube-api-access-4m6rb" (OuterVolumeSpecName: "kube-api-access-4m6rb") pod "620ff3f4-7601-4fa8-989c-b5a98f2cc884" (UID: "620ff3f4-7601-4fa8-989c-b5a98f2cc884"). 
InnerVolumeSpecName "kube-api-access-4m6rb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 24 22:16:28.788046 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.788022 2580 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/620ff3f4-7601-4fa8-989c-b5a98f2cc884-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "620ff3f4-7601-4fa8-989c-b5a98f2cc884" (UID: "620ff3f4-7601-4fa8-989c-b5a98f2cc884"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 24 22:16:28.881200 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.881173 2580 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4m6rb\" (UniqueName: \"kubernetes.io/projected/620ff3f4-7601-4fa8-989c-b5a98f2cc884-kube-api-access-4m6rb\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 22:16:28.881200 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:28.881196 2580 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/620ff3f4-7601-4fa8-989c-b5a98f2cc884-must-gather-output\") on node \"ip-10-0-133-36.ec2.internal\" DevicePath \"\"" Apr 24 22:16:29.583587 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:29.583551 2580 generic.go:358] "Generic (PLEG): container finished" podID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerID="c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111" exitCode=143 Apr 24 22:16:29.584046 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:29.583608 2580 scope.go:117] "RemoveContainer" containerID="c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111" Apr 24 22:16:29.584046 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:29.583617 2580 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-5ch7l/must-gather-d5h6m" Apr 24 22:16:29.591593 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:29.591304 2580 scope.go:117] "RemoveContainer" containerID="f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3" Apr 24 22:16:29.605310 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:29.605287 2580 scope.go:117] "RemoveContainer" containerID="c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111" Apr 24 22:16:29.605593 ip-10-0-133-36 kubenswrapper[2580]: E0424 22:16:29.605570 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111\": container with ID starting with c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111 not found: ID does not exist" containerID="c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111" Apr 24 22:16:29.605663 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:29.605607 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111"} err="failed to get container status \"c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111\": rpc error: code = NotFound desc = could not find container \"c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111\": container with ID starting with c16519087f1d496bc3a35a5827dd292ffaf78355b7872b1f6398444c5a087111 not found: ID does not exist" Apr 24 22:16:29.605663 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:29.605635 2580 scope.go:117] "RemoveContainer" containerID="f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3" Apr 24 22:16:29.605866 ip-10-0-133-36 kubenswrapper[2580]: E0424 22:16:29.605851 2580 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3\": container with ID starting with f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3 not found: ID does not exist" containerID="f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3" Apr 24 22:16:29.605920 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:29.605875 2580 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3"} err="failed to get container status \"f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3\": rpc error: code = NotFound desc = could not find container \"f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3\": container with ID starting with f0ef9d99d5a39f2d92d723679a75e77da2bef57a6b3803215531ee09fc7f2bf3 not found: ID does not exist" Apr 24 22:16:30.688699 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:30.688654 2580 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" path="/var/lib/kubelet/pods/620ff3f4-7601-4fa8-989c-b5a98f2cc884/volumes" Apr 24 22:16:30.982600 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:30.982502 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-bvg5d_40b0cb6d-5360-4969-b911-a4d5822d26e9/kuadrant-console-plugin/0.log" Apr 24 22:16:31.058209 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:31.058184 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-limitador-67566c68b4-mrc5l_7fd39c61-a41c-4b55-8bdd-3a5ecfb5e040/limitador/0.log" Apr 24 22:16:32.296565 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:32.296532 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-542j7_6cd08957-865f-4442-98d3-f5cd050c3fb6/cluster-monitoring-operator/0.log" Apr 24 22:16:32.474849 ip-10-0-133-36 
kubenswrapper[2580]: I0424 22:16:32.474819 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c5fkj_5d653d51-55b5-48dc-8ba4-a85466d08ff4/node-exporter/0.log" Apr 24 22:16:32.503581 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:32.503556 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c5fkj_5d653d51-55b5-48dc-8ba4-a85466d08ff4/kube-rbac-proxy/0.log" Apr 24 22:16:32.531045 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:32.531022 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-c5fkj_5d653d51-55b5-48dc-8ba4-a85466d08ff4/init-textfile/0.log" Apr 24 22:16:35.060177 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.060136 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/1.log" Apr 24 22:16:35.064640 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.064623 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-zxbmp_8ae26ae6-db56-4447-825f-208c0ab19d34/console-operator/2.log" Apr 24 22:16:35.273116 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.273085 2580 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"] Apr 24 22:16:35.273384 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.273373 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerName="copy" Apr 24 22:16:35.273438 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.273386 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerName="copy" Apr 24 22:16:35.273438 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.273395 2580 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerName="gather"
Apr 24 22:16:35.273438 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.273401 2580 state_mem.go:107] "Deleted CPUSet assignment" podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerName="gather"
Apr 24 22:16:35.273532 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.273466 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerName="copy"
Apr 24 22:16:35.273532 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.273477 2580 memory_manager.go:356] "RemoveStaleState removing state" podUID="620ff3f4-7601-4fa8-989c-b5a98f2cc884" containerName="gather"
Apr 24 22:16:35.276345 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.276325 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.278639 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.278612 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wmq6v\"/\"kube-root-ca.crt\""
Apr 24 22:16:35.279390 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.279361 2580 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-wmq6v\"/\"default-dockercfg-4l88c\""
Apr 24 22:16:35.279499 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.279361 2580 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-wmq6v\"/\"openshift-service-ca.crt\""
Apr 24 22:16:35.285686 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.285668 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"]
Apr 24 22:16:35.332827 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.332741 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-sys\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.332827 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.332788 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-proc\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.333051 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.332845 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-lib-modules\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.333051 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.332897 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-podres\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.333051 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.332924 2580 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7tsfr\" (UniqueName: \"kubernetes.io/projected/b85447d9-8184-4c4e-a443-0f6f07cdb589-kube-api-access-7tsfr\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434357 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434299 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-sys\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434357 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434367 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-proc\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434632 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434432 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-proc\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434632 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434448 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-sys\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434632 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434495 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-lib-modules\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434632 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434544 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-podres\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434632 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434577 2580 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7tsfr\" (UniqueName: \"kubernetes.io/projected/b85447d9-8184-4c4e-a443-0f6f07cdb589-kube-api-access-7tsfr\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434870 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434645 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-podres\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.434870 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.434660 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/b85447d9-8184-4c4e-a443-0f6f07cdb589-lib-modules\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.443630 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.443603 2580 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7tsfr\" (UniqueName: \"kubernetes.io/projected/b85447d9-8184-4c4e-a443-0f6f07cdb589-kube-api-access-7tsfr\") pod \"perf-node-gather-daemonset-hlb7q\" (UID: \"b85447d9-8184-4c4e-a443-0f6f07cdb589\") " pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.587045 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.586918 2580 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:35.713843 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:35.713262 2580 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"]
Apr 24 22:16:36.095801 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:36.095769 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-pl48h_f161940e-db46-4df4-9316-662b09a296c4/volume-data-source-validator/0.log"
Apr 24 22:16:36.606015 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:36.605958 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q" event={"ID":"b85447d9-8184-4c4e-a443-0f6f07cdb589","Type":"ContainerStarted","Data":"288dc9c456a737160eccdaf43ea5e814dfd1379eed371890e6ed487d6cbd621f"}
Apr 24 22:16:36.606015 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:36.606013 2580 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q" event={"ID":"b85447d9-8184-4c4e-a443-0f6f07cdb589","Type":"ContainerStarted","Data":"0b94c4b4a2217384f89c447cc56ad38df68f637143fe618a0a5996b1139004de"}
Apr 24 22:16:36.606240 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:36.606095 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:36.624226 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:36.624176 2580 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q" podStartSLOduration=1.624159691 podStartE2EDuration="1.624159691s" podCreationTimestamp="2026-04-24 22:16:35 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-24 22:16:36.622786775 +0000 UTC m=+2948.420849742" watchObservedRunningTime="2026-04-24 22:16:36.624159691 +0000 UTC m=+2948.422222657"
Apr 24 22:16:36.841038 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:36.840987 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fp6xs_78c8f2d7-06fd-41fd-91a1-02af0d79bea4/dns/0.log"
Apr 24 22:16:36.863525 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:36.863449 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-fp6xs_78c8f2d7-06fd-41fd-91a1-02af0d79bea4/kube-rbac-proxy/0.log"
Apr 24 22:16:37.047086 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:37.047057 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-trd5d_fa996ab4-6084-48e7-92d1-518c14773d43/dns-node-resolver/0.log"
Apr 24 22:16:37.544127 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:37.544099 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-pzhk2_b026702e-c0cf-4243-981d-4f64cfc8b0a0/node-ca/0.log"
Apr 24 22:16:38.461738 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:38.461711 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7f4c79d4bd-dxrjx_a318551b-2eb1-436a-8979-f4e740c2e662/router/0.log"
Apr 24 22:16:38.954216 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:38.954181 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-5gnht_545ca8a5-6c3a-4c0f-bd52-f71dfa7a1804/serve-healthcheck-canary/0.log"
Apr 24 22:16:39.426958 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:39.426922 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-w6q7n_fc9822dc-9a3d-4fdf-94b1-053fb0f0608b/insights-operator/0.log"
Apr 24 22:16:39.427158 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:39.427065 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-w6q7n_fc9822dc-9a3d-4fdf-94b1-053fb0f0608b/insights-operator/1.log"
Apr 24 22:16:39.449956 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:39.449924 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6ck65_e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a/kube-rbac-proxy/0.log"
Apr 24 22:16:39.472326 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:39.472302 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6ck65_e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a/exporter/0.log"
Apr 24 22:16:39.494864 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:39.494825 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-6ck65_e442ca2c-20fc-4efd-a8db-bf2f9d0ae75a/extractor/0.log"
Apr 24 22:16:42.209519 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:42.209488 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-5774f66dc9-c6c4h_edfa84d5-b4ca-4238-b102-34b23c953972/manager/0.log"
Apr 24 22:16:42.618636 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:42.618566 2580 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-wmq6v/perf-node-gather-daemonset-hlb7q"
Apr 24 22:16:43.142987 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:43.142961 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-6dhp2_989e4967-b2c7-40ab-bf6b-df37a1303bf4/s3-init/0.log"
Apr 24 22:16:48.465990 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:48.465958 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-74bgx_a4d3d8ce-54d9-4c28-8043-df84ae070d16/kube-storage-version-migrator-operator/1.log"
Apr 24 22:16:48.467413 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:48.467361 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-74bgx_a4d3d8ce-54d9-4c28-8043-df84ae070d16/kube-storage-version-migrator-operator/0.log"
Apr 24 22:16:49.571807 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:49.571777 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6hjtw_5a0ab547-ecd2-4df3-9477-0144645571ea/kube-multus-additional-cni-plugins/0.log"
Apr 24 22:16:49.596239 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:49.596214 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6hjtw_5a0ab547-ecd2-4df3-9477-0144645571ea/egress-router-binary-copy/0.log"
Apr 24 22:16:49.620829 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:49.620805 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6hjtw_5a0ab547-ecd2-4df3-9477-0144645571ea/cni-plugins/0.log"
Apr 24 22:16:49.643652 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:49.643629 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6hjtw_5a0ab547-ecd2-4df3-9477-0144645571ea/bond-cni-plugin/0.log"
Apr 24 22:16:49.665010 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:49.664965 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6hjtw_5a0ab547-ecd2-4df3-9477-0144645571ea/routeoverride-cni/0.log"
Apr 24 22:16:49.686747 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:49.686718 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6hjtw_5a0ab547-ecd2-4df3-9477-0144645571ea/whereabouts-cni-bincopy/0.log"
Apr 24 22:16:49.709241 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:49.709217 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-6hjtw_5a0ab547-ecd2-4df3-9477-0144645571ea/whereabouts-cni/0.log"
Apr 24 22:16:50.073679 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:50.073650 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bvrnt_95d3f7fe-3212-4144-bef0-8f34cd69da83/kube-multus/0.log"
Apr 24 22:16:50.103135 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:50.103106 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c8k6b_dff89703-eb5c-40dd-b22c-a598308414bc/network-metrics-daemon/0.log"
Apr 24 22:16:50.146551 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:50.146523 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-c8k6b_dff89703-eb5c-40dd-b22c-a598308414bc/kube-rbac-proxy/0.log"
Apr 24 22:16:51.062191 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.062112 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-controller/0.log"
Apr 24 22:16:51.083203 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.083172 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/0.log"
Apr 24 22:16:51.095729 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.095705 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovn-acl-logging/1.log"
Apr 24 22:16:51.116493 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.116465 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/kube-rbac-proxy-node/0.log"
Apr 24 22:16:51.139716 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.139690 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/kube-rbac-proxy-ovn-metrics/0.log"
Apr 24 22:16:51.162238 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.162215 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/northd/0.log"
Apr 24 22:16:51.184065 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.184044 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/nbdb/0.log"
Apr 24 22:16:51.206761 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.206737 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/sbdb/0.log"
Apr 24 22:16:51.303150 ip-10-0-133-36 kubenswrapper[2580]: I0424 22:16:51.303115 2580 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-bt4zz_33177b79-148a-414b-a4ea-c1c5c4ff4faf/ovnkube-controller/0.log"