Apr 23 13:30:57.620414 ip-10-0-133-33 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:30:58.012906 ip-10-0-133-33 kubenswrapper[2581]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:30:58.012906 ip-10-0-133-33 kubenswrapper[2581]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:30:58.012906 ip-10-0-133-33 kubenswrapper[2581]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:30:58.012906 ip-10-0-133-33 kubenswrapper[2581]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:30:58.012906 ip-10-0-133-33 kubenswrapper[2581]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
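The deprecation warnings above all point the same way: these flags should move into the file passed via --config (here /etc/kubernetes/kubelet.conf). A minimal sketch of the corresponding KubeletConfiguration fields — the values below are illustrative assumptions, not read from this node:

```yaml
# Hypothetical fragment of the file passed to --config.
# Field names are real KubeletConfiguration fields; values are assumed.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
containerRuntimeEndpoint: "unix:///var/run/crio/crio.sock"   # replaces --container-runtime-endpoint
volumePluginDir: "/etc/kubernetes/kubelet-plugins/volume/exec" # replaces --volume-plugin-dir (assumed path)
systemReserved:                                               # replaces --system-reserved
  cpu: "500m"        # illustrative value
  memory: "1Gi"      # illustrative value
evictionHard:                                                 # suggested replacement for --minimum-container-ttl-duration
  memory.available: "100Mi"
```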
Apr 23 13:30:58.014233 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.014137 2581 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:30:58.017071 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017056 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:30:58.017071 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017072 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017076 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017079 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017082 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017086 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017089 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017091 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017094 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017097 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017099 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017102 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017112 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017115 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017117 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017120 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017122 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017125 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017128 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017130 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:30:58.017131 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017133 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017137 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017142 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017144 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017147 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017162 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017165 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017168 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017170 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017173 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017175 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017178 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017181 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017183 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017186 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017189 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017191 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017194 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017198 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017200 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:30:58.017591 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017203 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017206 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017208 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017212 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017216 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017220 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017223 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017225 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017228 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017230 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017232 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017235 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017237 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017240 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017243 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017246 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017249 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017252 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017254 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:30:58.018134 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017257 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017260 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017264 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017266 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017269 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017272 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017274 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017277 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017279 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017282 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017285 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017288 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017290 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017292 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017295 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017298 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017300 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017303 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017305 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017308 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:30:58.018606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017311 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:30:58.019077 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017313 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:30:58.019077 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017316 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:30:58.019077 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017318 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:30:58.019077 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017320 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:30:58.019077 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017323 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:30:58.019077 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.017325 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:30:58.019527 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019515 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:30:58.019527 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019526 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019530 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019533 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019536 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019539 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019542 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019545 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019548 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019550 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019553 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019556 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019558 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019561 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019564 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019566 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019568 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019571 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019573 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019576 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019578 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:30:58.019581 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019581 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019583 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019586 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019596 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019599 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019601 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019603 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019608 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019611 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019613 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019616 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019619 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019621 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019623 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019626 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019628 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019631 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019633 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019635 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:30:58.020069 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019638 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019640 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019642 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019645 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019647 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019649 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019652 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019654 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019656 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019659 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019661 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019663 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019666 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019668 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019670 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019674 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019677 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019685 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019687 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019690 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:30:58.020564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019693 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019695 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019697 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019700 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019702 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019704 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019707 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019709 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019711 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019714 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019716 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019720 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019724 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019727 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019730 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019732 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019735 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019738 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019741 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:30:58.021068 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019743 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019745 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019748 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019750 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019753 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019755 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.019758 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019839 2581 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019847 2581 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019860 2581 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019865 2581 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019875 2581 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019879 2581 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019883 2581 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019888 2581 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019891 2581 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019897 2581 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019901 2581 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019904 2581 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019907 2581 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019910 2581 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019913 2581 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019916 2581 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:30:58.021548 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019919 2581 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019921 2581 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019924 2581 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019930 2581 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019933 2581 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019936 2581 flags.go:64] FLAG: --config-dir=""
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019939 2581 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019943 2581 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019946 2581 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019949 2581 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019953 2581 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019956 2581 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019959 2581 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019970 2581 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019973 2581 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019977 2581 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019980 2581 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019985 2581 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019988 2581 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019991 2581 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.019993 2581 flags.go:64] FLAG: --enable-load-reader="false"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020004 2581 flags.go:64] FLAG: --enable-server="true"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020007 2581 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020015 2581 flags.go:64] FLAG: --event-burst="100"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020018 2581 flags.go:64] FLAG: --event-qps="50"
Apr 23 13:30:58.022087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020021 2581 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020024 2581 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020026 2581 flags.go:64] FLAG: --eviction-hard=""
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020030 2581 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020033 2581 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020036 2581 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020039 2581 flags.go:64] FLAG: --eviction-soft=""
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020042 2581 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020045 2581 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020047 2581 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020050 2581 flags.go:64] FLAG: --experimental-mounter-path=""
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020053 2581 flags.go:64] FLAG: --fail-cgroupv1="false"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020056 2581 flags.go:64] FLAG: --fail-swap-on="true"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020059 2581 flags.go:64] FLAG: --feature-gates=""
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020063 2581 flags.go:64] FLAG: --file-check-frequency="20s"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020066 2581 flags.go:64] FLAG: --global-housekeeping-interval="1m0s"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020069 2581 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020072 2581 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020075 2581 flags.go:64] FLAG: --healthz-port="10248"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020078 2581 flags.go:64] FLAG: --help="false"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020081 2581 flags.go:64] FLAG: --hostname-override="ip-10-0-133-33.ec2.internal"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020085 2581 flags.go:64] FLAG: --housekeeping-interval="10s"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020087 2581 flags.go:64] FLAG: --http-check-frequency="20s"
Apr 23 13:30:58.022713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020090 2581 flags.go:64] FLAG:
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020095 2581 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020098 2581 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020101 2581 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020103 2581 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020106 2581 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020115 2581 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020118 2581 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020121 2581 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020124 2581 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020127 2581 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020130 2581 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020133 2581 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020135 2581 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:30:58.023268 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:30:58.020138 2581 flags.go:64] FLAG: --lock-file="" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020141 2581 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020144 2581 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020146 2581 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020166 2581 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020174 2581 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020177 2581 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020182 2581 flags.go:64] FLAG: --logging-format="text" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020187 2581 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020192 2581 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:30:58.023268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020197 2581 flags.go:64] FLAG: --manifest-url="" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020202 2581 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020209 2581 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020212 2581 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020216 2581 flags.go:64] FLAG: --max-pods="110" Apr 23 13:30:58.023855 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020219 2581 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020222 2581 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020225 2581 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020228 2581 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020231 2581 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020234 2581 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020237 2581 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020245 2581 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020248 2581 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020251 2581 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020260 2581 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020263 2581 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020274 2581 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020278 2581 flags.go:64] 
FLAG: --pod-max-pids="-1" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020284 2581 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020302 2581 flags.go:64] FLAG: --port="10250" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020305 2581 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020308 2581 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-0ecbeea42dae5f6a2" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020311 2581 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:30:58.023855 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020314 2581 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020317 2581 flags.go:64] FLAG: --register-node="true" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020320 2581 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020323 2581 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020327 2581 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020330 2581 flags.go:64] FLAG: --registry-qps="5" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020332 2581 flags.go:64] FLAG: --reserved-cpus="" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020335 2581 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020339 2581 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020342 2581 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 
13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020345 2581 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020348 2581 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020351 2581 flags.go:64] FLAG: --runonce="false" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020354 2581 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020357 2581 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020361 2581 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020366 2581 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020370 2581 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020376 2581 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020381 2581 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020384 2581 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020386 2581 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020389 2581 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020392 2581 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:30:58.020401 2581 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020404 2581 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:30:58.024490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020407 2581 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020410 2581 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020415 2581 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020418 2581 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020421 2581 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020427 2581 flags.go:64] FLAG: --tls-min-version="" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020430 2581 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020433 2581 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020436 2581 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020439 2581 flags.go:64] FLAG: --topology-manager-scope="container" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020441 2581 flags.go:64] FLAG: --v="2" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020448 2581 flags.go:64] FLAG: --version="false" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020455 2581 flags.go:64] FLAG: --vmodule="" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:30:58.020462 2581 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.020467 2581 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020600 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020604 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020607 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020610 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020614 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020619 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020623 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020627 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:30:58.025092 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020631 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020634 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020637 2581 feature_gate.go:328] unrecognized feature gate: 
InsightsConfig Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020639 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020641 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020644 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020646 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020649 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020657 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020660 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020662 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020665 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020667 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020669 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020672 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020674 2581 
feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020677 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020679 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020682 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020684 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:30:58.025658 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020687 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020689 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020692 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020699 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020703 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020708 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020711 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020715 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 
13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020718 2581 feature_gate.go:328] unrecognized feature gate: Example Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020720 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020723 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020725 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020728 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020730 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020733 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020736 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020738 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020740 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020743 2581 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020745 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:30:58.026164 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020748 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 
13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020757 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020760 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020763 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020765 2581 feature_gate.go:328] unrecognized feature gate: Example2 Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020767 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020770 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020772 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020775 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020779 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020783 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020790 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020795 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020800 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020803 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020808 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020811 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020814 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020817 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:30:58.026941 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020819 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020823 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020826 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020829 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020832 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020834 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020836 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020839 2581 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020841 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020844 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020847 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020849 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020851 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020854 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020856 2581 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020866 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020870 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020874 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:30:58.027606 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.020878 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:30:58.028084 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.021436 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:30:58.030021 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.030003 2581 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 13:30:58.030063 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.030022 2581 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030067 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030073 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030077 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030081 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030084 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030087 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030090 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030093 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030096 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:30:58.030094 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030099 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030101 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030104 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030106 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030109 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030111 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030114 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030116 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030119 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030121 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030124 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030126 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030129 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030132 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030137 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030140 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030142 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030145 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030147 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030163 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:30:58.030353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030165 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030168 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030170 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030173 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030176 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030178 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030181 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030183 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030186 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030188 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030191 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030193 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030195 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030198 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030202 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030204 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030207 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030209 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030212 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030214 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:30:58.030834 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030217 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030220 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030222 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030225 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030227 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030230 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030232 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030235 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030237 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030240 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030242 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030245 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030247 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030249 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030252 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030254 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030257 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030259 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030262 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030264 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:30:58.031320 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030266 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030268 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030271 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030273 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030276 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030278 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030281 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030284 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030287 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030289 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030292 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030294 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030296 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030299 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030301 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030304 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:30:58.031807 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030306 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.030311 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030401 2581 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030406 2581 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030409 2581 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030412 2581 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030416 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030418 2581 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030421 2581 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030423 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030426 2581 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030428 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030431 2581 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030434 2581 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030438 2581 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:30:58.032207 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030441 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030443 2581 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030446 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030448 2581 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030451 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030453 2581 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030456 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030458 2581 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030462 2581 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030465 2581 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030468 2581 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030470 2581 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030473 2581 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030475 2581 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030478 2581 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030480 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030483 2581 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030485 2581 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030488 2581 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030491 2581 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:30:58.032564 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030493 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030496 2581 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030498 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030501 2581 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030503 2581 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030505 2581 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030508 2581 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030512 2581 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030516 2581 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030519 2581 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030522 2581 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030524 2581 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030527 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030530 2581 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030532 2581 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030534 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030537 2581 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030540 2581 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030542 2581 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030545 2581 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:30:58.033046 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030547 2581 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030550 2581 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030553 2581 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030556 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030558 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030560 2581 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030563 2581 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030565 2581 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030568 2581 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030570 2581 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030572 2581 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030575 2581 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030577 2581 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030580 2581 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030582 2581 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030584 2581 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030587 2581 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030589 2581 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030592 2581 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030594 2581 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:30:58.033610 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030597 2581 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030599 2581 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030602 2581 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030604 2581 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030606 2581 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030609 2581 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030611 2581 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030613 2581 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030616 2581 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030618 2581 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030620 2581 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030623 2581 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:58.030625 2581 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.030630 2581 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:30:58.034087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.031197 2581 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 13:30:58.034444 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.034208 2581 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 13:30:58.035075 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.035062 2581 server.go:1019] "Starting client certificate rotation"
Apr 23 13:30:58.035195 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.035180 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:30:58.035230 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.035221 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:30:58.057441 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.057423 2581 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:30:58.060125 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.060095 2581 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:30:58.071387 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.071365 2581 log.go:25] "Validated CRI v1 runtime API"
Apr 23 13:30:58.076103 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.076085 2581 log.go:25] "Validated CRI v1 image API"
Apr 23 13:30:58.078988 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.078974 2581 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 13:30:58.081911 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.081891 2581 fs.go:135] Filesystem UUIDs: map[17d25415-0c91-4dcd-9b5e-bb9c7136cc19:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2 e7dbee43-486e-4bdd-a882-d036fb0e8e62:/dev/nvme0n1p3]
Apr 23 13:30:58.081969 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.081911 2581 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 13:30:58.083511 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.083490 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:30:58.088666 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.088562 2581 manager.go:217] Machine: {Timestamp:2026-04-23 13:30:58.086076316 +0000 UTC m=+0.358005641 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3092220 MemoryCapacity:32812175360 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec27f388b64af83ffb081bf9e508ec70 SystemUUID:ec27f388-b64a-f83f-fb08-1bf9e508ec70 BootID:58241a08-7a91-45ba-bb50-a8e94f4a25d7 Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16406085632 Type:vfs Inodes:4005392 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6562435072 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16406089728 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:2a:74:1e:7b:c1 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:2a:74:1e:7b:c1 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:e2:33:4c:8c:cc:08 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:32812175360 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:34603008 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 13:30:58.088666 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.088661 2581 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 13:30:58.088771 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.088741 2581 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 13:30:58.089106 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.089086 2581 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 13:30:58.089258 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.089108 2581 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-133-33.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 13:30:58.089302 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.089269 2581 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 13:30:58.089302 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.089277 2581 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 13:30:58.089302 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.089291 2581 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:30:58.090044 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.090033 2581 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:30:58.091682 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.091671 2581 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:30:58.091799 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.091790 2581 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 13:30:58.093838 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.093828 2581 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 13:30:58.093874 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.093842 2581 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 13:30:58.093874 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.093854 2581 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 13:30:58.093874 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.093863 2581 kubelet.go:397] "Adding apiserver pod source"
Apr 23 13:30:58.093874 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.093871 2581 apiserver.go:42] "Waiting for node sync before watching
apiserver pods" Apr 23 13:30:58.094885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.094873 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:30:58.094944 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.094890 2581 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking" Apr 23 13:30:58.098428 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.098411 2581 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1" Apr 23 13:30:58.099566 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.099553 2581 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 13:30:58.101499 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101485 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 13:30:58.101537 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101519 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 13:30:58.101537 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101528 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 13:30:58.101537 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101534 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 13:30:58.101624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101540 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 13:30:58.101624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101546 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 13:30:58.101624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101551 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 
13:30:58.101624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101556 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 13:30:58.101624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101564 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 13:30:58.101624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101570 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 13:30:58.101624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101584 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 13:30:58.101624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.101593 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 13:30:58.102930 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.102921 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 13:30:58.102966 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.102932 2581 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 13:30:58.106775 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.106668 2581 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 13:30:58.106775 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.106705 2581 server.go:1295] "Started kubelet" Apr 23 13:30:58.106893 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.106765 2581 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 13:30:58.107329 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.107284 2581 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 13:30:58.107400 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.107359 2581 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 13:30:58.107606 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.107579 2581 csi_plugin.go:988] Failed to contact API server 
when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-133-33.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope Apr 23 13:30:58.107657 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.107640 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" Apr 23 13:30:58.107670 ip-10-0-133-33 systemd[1]: Started Kubernetes Kubelet. Apr 23 13:30:58.107761 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.107689 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-133-33.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node" Apr 23 13:30:58.108380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.108329 2581 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock" Apr 23 13:30:58.109326 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.109310 2581 server.go:317] "Adding debug handlers to kubelet server" Apr 23 13:30:58.112408 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.112387 2581 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving" Apr 23 13:30:58.112497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.112394 2581 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer" Apr 23 13:30:58.114460 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.114437 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found" Apr 23 13:30:58.114696 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:30:58.114677 2581 volume_manager.go:295] "The desired_state_of_world populator starts" Apr 23 13:30:58.114792 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.114784 2581 volume_manager.go:297] "Starting Kubelet Volume Manager" Apr 23 13:30:58.115119 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.114712 2581 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory Apr 23 13:30:58.115241 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.115124 2581 factory.go:55] Registering systemd factory Apr 23 13:30:58.115241 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.115136 2581 factory.go:223] Registration of the systemd container factory successfully Apr 23 13:30:58.115355 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.115336 2581 reconstruct.go:97] "Volume reconstruction finished" Apr 23 13:30:58.115355 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.115346 2581 reconciler.go:26] "Reconciler: start to sync state" Apr 23 13:30:58.115453 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.115434 2581 desired_state_of_world_populator.go:150] "Desired state populator starts to run" Apr 23 13:30:58.119028 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.119014 2581 factory.go:153] Registering CRI-O factory Apr 23 13:30:58.119104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.119031 2581 factory.go:223] Registration of the crio container factory successfully Apr 23 13:30:58.119104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.119047 2581 factory.go:103] Registering Raw factory Apr 23 13:30:58.119104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.119062 2581 manager.go:1196] Started watching for new ooms in manager Apr 23 13:30:58.119478 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.119465 2581 manager.go:319] Starting recovery of 
all containers Apr 23 13:30:58.119825 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.113620 2581 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-133-33.ec2.internal.18a8ff9299b40289 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-133-33.ec2.internal,UID:ip-10-0-133-33.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-133-33.ec2.internal,},FirstTimestamp:2026-04-23 13:30:58.106679945 +0000 UTC m=+0.378609270,LastTimestamp:2026-04-23 13:30:58.106679945 +0000 UTC m=+0.378609270,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-133-33.ec2.internal,}" Apr 23 13:30:58.120727 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.120695 2581 kubelet.go:1618] "Image garbage collection failed once. 
Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache" Apr 23 13:30:58.124350 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.124183 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zdsc9" Apr 23 13:30:58.126475 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.126448 2581 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" Apr 23 13:30:58.126737 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.126572 2581 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-133-33.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms" Apr 23 13:30:58.130390 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.130374 2581 manager.go:324] Recovery completed Apr 23 13:30:58.131432 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.131416 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-zdsc9" Apr 23 13:30:58.134105 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.134094 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:30:58.136697 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.136681 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:30:58.136764 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.136710 2581 kubelet_node_status.go:736] "Recording event 
message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:30:58.136764 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.136720 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:30:58.137234 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.137216 2581 cpu_manager.go:222] "Starting CPU manager" policy="none" Apr 23 13:30:58.137234 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.137233 2581 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s" Apr 23 13:30:58.137234 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.137248 2581 state_mem.go:36] "Initialized new in-memory state store" Apr 23 13:30:58.139117 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.139105 2581 policy_none.go:49] "None policy: Start" Apr 23 13:30:58.139174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.139123 2581 memory_manager.go:186] "Starting memorymanager" policy="None" Apr 23 13:30:58.139174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.139133 2581 state_mem.go:35] "Initializing new in-memory state store" Apr 23 13:30:58.181525 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.181483 2581 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv4" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.181697 2581 manager.go:341] "Starting Device Plugin manager" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.181722 2581 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.181732 2581 server.go:85] "Starting device plugin registration server" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.181933 2581 eviction_manager.go:189] "Eviction manager: starting control loop" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.181944 2581 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.182083 2581 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.182175 2581 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.182186 2581 plugin_manager.go:118] "Starting Kubelet Plugin Manager" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.182600 2581 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\"" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.182638 2581 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-133-33.ec2.internal\" not found" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.182875 2581 kubelet_network_linux.go:49] "Initialized iptables rules." 
protocol="IPv6" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.182896 2581 status_manager.go:230] "Starting to sync pod status with apiserver" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.182910 2581 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started." Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.182917 2581 kubelet.go:2451] "Starting kubelet main sync loop" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.182946 2581 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful" Apr 23 13:30:58.194486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.186376 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160" Apr 23 13:30:58.282884 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.282826 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:30:58.283006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.282970 2581 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal"] Apr 23 13:30:58.283054 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.283048 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:30:58.283912 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.283882 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:30:58.284024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.283917 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 
13:30:58.284024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.283884 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:30:58.284024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.283931 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:30:58.284024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.283951 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:30:58.284024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.283971 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:30:58.284024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.283990 2581 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.286281 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.286265 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:30:58.286413 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.286394 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.286468 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.286425 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:30:58.287192 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.287177 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:30:58.287253 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.287200 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:30:58.287253 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.287213 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:30:58.287253 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.287180 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:30:58.287253 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.287245 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:30:58.287363 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.287258 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:30:58.289365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.289351 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.289434 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.289375 2581 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach" Apr 23 13:30:58.290245 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.290228 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientMemory" Apr 23 13:30:58.290326 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.290256 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasNoDiskPressure" Apr 23 13:30:58.290326 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.290268 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeHasSufficientPID" Apr 23 13:30:58.295451 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.295434 2581 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.295526 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.295457 2581 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-133-33.ec2.internal\": node \"ip-10-0-133-33.ec2.internal\" not found" Apr 23 13:30:58.310253 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.310237 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found" Apr 23 13:30:58.315678 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.315663 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-33.ec2.internal\" not found" node="ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.316227 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.316208 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d6d6b5027305b2401681b1fe05011e3e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal\" (UID: \"d6d6b5027305b2401681b1fe05011e3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.316311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.316233 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6d6b5027305b2401681b1fe05011e3e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal\" (UID: \"d6d6b5027305b2401681b1fe05011e3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.316311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.316250 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/377b4825cda22d155cdffc4e7cfa1c2e-config\") pod \"kube-apiserver-proxy-ip-10-0-133-33.ec2.internal\" (UID: \"377b4825cda22d155cdffc4e7cfa1c2e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.319787 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.319772 2581 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-133-33.ec2.internal\" not found" node="ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.410758 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.410739 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found" Apr 23 13:30:58.417011 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.416994 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: 
\"kubernetes.io/host-path/d6d6b5027305b2401681b1fe05011e3e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal\" (UID: \"d6d6b5027305b2401681b1fe05011e3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.417077 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.417017 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6d6b5027305b2401681b1fe05011e3e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal\" (UID: \"d6d6b5027305b2401681b1fe05011e3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.417077 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.417035 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/377b4825cda22d155cdffc4e7cfa1c2e-config\") pod \"kube-apiserver-proxy-ip-10-0-133-33.ec2.internal\" (UID: \"377b4825cda22d155cdffc4e7cfa1c2e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.417077 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.417069 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/377b4825cda22d155cdffc4e7cfa1c2e-config\") pod \"kube-apiserver-proxy-ip-10-0-133-33.ec2.internal\" (UID: \"377b4825cda22d155cdffc4e7cfa1c2e\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.417187 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.417089 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/d6d6b5027305b2401681b1fe05011e3e-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal\" (UID: \"d6d6b5027305b2401681b1fe05011e3e\") " 
pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.417187 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.417103 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/d6d6b5027305b2401681b1fe05011e3e-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal\" (UID: \"d6d6b5027305b2401681b1fe05011e3e\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.511190 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.511165 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found" Apr 23 13:30:58.611900 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.611868 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found" Apr 23 13:30:58.617203 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.617181 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" Apr 23 13:30:58.622549 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.622533 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal"
Apr 23 13:30:58.712545 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.712506 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found"
Apr 23 13:30:58.813100 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.813066 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found"
Apr 23 13:30:58.913722 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:58.913646 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found"
Apr 23 13:30:58.989922 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:58.989899 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:30:59.014164 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:59.014132 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found"
Apr 23 13:30:59.034573 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.034550 2581 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 13:30:59.034679 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.034664 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:30:59.034728 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.034707 2581 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:30:59.113676 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.113650 2581 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 13:30:59.114208 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:59.114193 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found"
Apr 23 13:30:59.126833 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.126816 2581 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:30:59.133467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.133440 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:25:58 +0000 UTC" deadline="2027-10-11 03:20:16.1616854 +0000 UTC"
Apr 23 13:30:59.133467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.133462 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12853h49m17.028225987s"
Apr 23 13:30:59.149024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.149001 2581 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-zbmsx"
Apr 23 13:30:59.157032 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.157013 2581 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-zbmsx"
Apr 23 13:30:59.214341 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:30:59.214279 2581 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-133-33.ec2.internal\" not found"
Apr 23 13:30:59.250386 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.250347 2581 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:30:59.314961 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.314919 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal"
Apr 23 13:30:59.327879 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.327858 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:30:59.328706 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.328691 2581 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal"
Apr 23 13:30:59.335528 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.335509 2581 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 23 13:30:59.357522 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:59.357496 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd6d6b5027305b2401681b1fe05011e3e.slice/crio-6d990cf2410318f3413891932f55c01d12088403e8846565e0fe9c3c57e33457 WatchSource:0}: Error finding container 6d990cf2410318f3413891932f55c01d12088403e8846565e0fe9c3c57e33457: Status 404 returned error can't find the container with id 6d990cf2410318f3413891932f55c01d12088403e8846565e0fe9c3c57e33457
Apr 23 13:30:59.357782 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:30:59.357756 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod377b4825cda22d155cdffc4e7cfa1c2e.slice/crio-fa6c9b859557b4d33f7417a625ec62216dff6fe56dfffbdabb9e95192a247075 WatchSource:0}: Error finding container fa6c9b859557b4d33f7417a625ec62216dff6fe56dfffbdabb9e95192a247075: Status 404 returned error can't find the container with id fa6c9b859557b4d33f7417a625ec62216dff6fe56dfffbdabb9e95192a247075
Apr 23 13:30:59.362723 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.362707 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:30:59.466028 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.465978 2581 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:30:59.918595 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:30:59.918403 2581 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:00.095312 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.095277 2581 apiserver.go:52] "Watching apiserver"
Apr 23 13:31:00.102569 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.102412 2581 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66"
Apr 23 13:31:00.103624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.103598 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/network-metrics-daemon-fpksp","openshift-ovn-kubernetes/ovnkube-node-kgrc5","kube-system/konnectivity-agent-qnswk","kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r","openshift-image-registry/node-ca-6lrl9","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal","openshift-multus/multus-additional-cni-plugins-cskjl","openshift-network-diagnostics/network-check-target-749xq","openshift-network-operator/iptables-alerter-682sp","openshift-cluster-node-tuning-operator/tuned-8ddqs","openshift-multus/multus-vlh7k"]
Apr 23 13:31:00.106873 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.106604 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:00.106873 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.106678 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:00.109130 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.109110 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.111661 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.111487 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\""
Apr 23 13:31:00.111757 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.111700 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cs55k\""
Apr 23 13:31:00.112098 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.112074 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\""
Apr 23 13:31:00.114266 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.114246 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.115230 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.115207 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\""
Apr 23 13:31:00.119174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.116884 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\""
Apr 23 13:31:00.119174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.117090 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\""
Apr 23 13:31:00.119174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.117358 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6lrl9"
Apr 23 13:31:00.119174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.117457 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\""
Apr 23 13:31:00.119174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.118182 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\""
Apr 23 13:31:00.119697 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.119671 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\""
Apr 23 13:31:00.120274 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.120256 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\""
Apr 23 13:31:00.122104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.120561 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6vjsf\""
Apr 23 13:31:00.122104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.120791 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\""
Apr 23 13:31:00.122104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.120989 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\""
Apr 23 13:31:00.122104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.121277 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\""
Apr 23 13:31:00.122104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.121498 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qnswk"
Apr 23 13:31:00.123893 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.122629 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.125134 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125117 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125862 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125889 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125682 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125712 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125788 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125747 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mrq52\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125802 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k28fs\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125811 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125843 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-j987r\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125890 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-systemd-units\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.125686 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\""
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126332 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-node-log\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126359 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-kubelet-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126383 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8t94c\" (UniqueName: \"kubernetes.io/projected/a48db93d-b4e4-47c7-b0f0-6499dce26319-kube-api-access-8t94c\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126404 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b566b258-1bd9-4623-93bd-ae1931a2bc34-host\") pod \"node-ca-6lrl9\" (UID: \"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126432 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126454 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-ovn\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126484 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-cni-bin\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126508 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126532 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.128785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126582 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126608 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7f7g\" (UniqueName: \"kubernetes.io/projected/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-kube-api-access-h7f7g\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126642 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-etc-openvswitch\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126678 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-cni-netd\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126713 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/991e4c70-1167-498d-8923-054a866e840b-konnectivity-ca\") pod \"konnectivity-agent-qnswk\" (UID: \"991e4c70-1167-498d-8923-054a866e840b\") " pod="kube-system/konnectivity-agent-qnswk"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126765 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-registration-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126801 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-device-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126825 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-var-lib-openvswitch\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126851 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-log-socket\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126873 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-socket-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126918 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-etc-selinux\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.126963 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-system-cni-dir\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127018 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-os-release\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127053 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127080 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbd9v\" (UniqueName: \"kubernetes.io/projected/c3fdc691-5051-4c4c-8360-ed987a28f315-kube-api-access-gbd9v\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127120 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-kubelet\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.130115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127144 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovnkube-config\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127186 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-env-overrides\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127209 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xdd2p\" (UniqueName: \"kubernetes.io/projected/b566b258-1bd9-4623-93bd-ae1931a2bc34-kube-api-access-xdd2p\") pod \"node-ca-6lrl9\" (UID: \"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127245 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-run-netns\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127268 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-systemd\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127327 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovn-node-metrics-cert\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127356 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovnkube-script-lib\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127378 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/991e4c70-1167-498d-8923-054a866e840b-agent-certs\") pod \"konnectivity-agent-qnswk\" (UID: \"991e4c70-1167-498d-8923-054a866e840b\") " pod="kube-system/konnectivity-agent-qnswk"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127402 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-slash\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127425 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-run-ovn-kubernetes\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ct4q2\" (UniqueName: \"kubernetes.io/projected/a139498f-5c4f-4db0-a95e-1d466b43fc87-kube-api-access-ct4q2\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127469 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-sys-fs\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127509 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b566b258-1bd9-4623-93bd-ae1931a2bc34-serviceca\") pod \"node-ca-6lrl9\" (UID: \"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127545 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cnibin\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.127571 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-openvswitch\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.128428 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.128481 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.130729 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-682sp"
Apr 23 13:31:00.131041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.130811 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.133112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.132769 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\""
Apr 23 13:31:00.133112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.133071 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.133407 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.133254 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zg2qg\""
Apr 23 13:31:00.133407 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.133312 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\""
Apr 23 13:31:00.133407 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.133254 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tx9b2\""
Apr 23 13:31:00.133898 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.133636 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\""
Apr 23 13:31:00.133898 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.133694 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:31:00.133898 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.133703 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:31:00.134972 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.134952 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\""
Apr 23 13:31:00.135627 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.135395 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-s6685\""
Apr 23 13:31:00.158804 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.158778 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:25:59 +0000 UTC" deadline="2027-12-10 10:17:17.158905635 +0000 UTC"
Apr 23 13:31:00.158804 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.158804 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14300h46m17.000103898s"
Apr 23 13:31:00.187828 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.187759 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal" event={"ID":"377b4825cda22d155cdffc4e7cfa1c2e","Type":"ContainerStarted","Data":"fa6c9b859557b4d33f7417a625ec62216dff6fe56dfffbdabb9e95192a247075"}
Apr 23 13:31:00.188830 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.188804 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" event={"ID":"d6d6b5027305b2401681b1fe05011e3e","Type":"ContainerStarted","Data":"6d990cf2410318f3413891932f55c01d12088403e8846565e0fe9c3c57e33457"}
Apr 23 13:31:00.217220 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.217136 2581 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 23 13:31:00.228754 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228725 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-run-netns\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.228894 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228759 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-systemd\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.228894 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228781 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovn-node-metrics-cert\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.228894 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228825 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-run-netns\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.228894 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228868 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/b566b258-1bd9-4623-93bd-ae1931a2bc34-serviceca\") pod \"node-ca-6lrl9\" (UID: \"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9"
Apr 23 13:31:00.229071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228896 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jllwj\" (UniqueName: \"kubernetes.io/projected/edc72f86-b958-4ee0-aa99-387d9ee23402-kube-api-access-jllwj\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp"
Apr 23 13:31:00.229071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228922 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName:
\"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-system-cni-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.229071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228947 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-k8s-cni-cncf-io\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.229071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.228974 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-kubelet-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.229071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229000 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8t94c\" (UniqueName: \"kubernetes.io/projected/a48db93d-b4e4-47c7-b0f0-6499dce26319-kube-api-access-8t94c\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.229071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229025 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-conf-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.229071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229050 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-ovn\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229072 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-cni-bin\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229098 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229117 2581 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. 
Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229125 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229172 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-systemd\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229280 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edc72f86-b958-4ee0-aa99-387d9ee23402-host-slash\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229299 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" 
(UniqueName: \"kubernetes.io/configmap/b566b258-1bd9-4623-93bd-ae1931a2bc34-serviceca\") pod \"node-ca-6lrl9\" (UID: \"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9" Apr 23 13:31:00.229327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229314 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-cni-netd\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229340 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/991e4c70-1167-498d-8923-054a866e840b-konnectivity-ca\") pod \"konnectivity-agent-qnswk\" (UID: \"991e4c70-1167-498d-8923-054a866e840b\") " pod="kube-system/konnectivity-agent-qnswk" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229370 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/edc72f86-b958-4ee0-aa99-387d9ee23402-iptables-alerter-script\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229399 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-var-lib-openvswitch\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229429 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-socket-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229454 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-etc-selinux\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229479 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-system-cni-dir\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229507 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysctl-conf\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229535 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbd9v\" (UniqueName: \"kubernetes.io/projected/c3fdc691-5051-4c4c-8360-ed987a28f315-kube-api-access-gbd9v\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " 
pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229562 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-host\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229587 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-os-release\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229591 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-var-lib-openvswitch\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229614 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-netns\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229628 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-cni-netd\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229639 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.229692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229639 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-daemon-config\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-socket-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229817 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-kubelet-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229815 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: 
\"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-cni-bin\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229853 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-system-cni-dir\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229900 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-etc-selinux\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229937 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-ovn\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229959 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/991e4c70-1167-498d-8923-054a866e840b-konnectivity-ca\") pod \"konnectivity-agent-qnswk\" (UID: \"991e4c70-1167-498d-8923-054a866e840b\") " pod="kube-system/konnectivity-agent-qnswk" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.229977 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" 
(UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-kubelet\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230007 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230077 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovnkube-config\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230113 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xdd2p\" (UniqueName: \"kubernetes.io/projected/b566b258-1bd9-4623-93bd-ae1931a2bc34-kube-api-access-xdd2p\") pod \"node-ca-6lrl9\" (UID: \"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-tuning-conf-dir\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230188 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-modprobe-d\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230193 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-kubelet\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230223 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-kubernetes\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230250 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-tuned\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.230533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230288 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-hostroot\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.231333 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:31:00.230318 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovnkube-script-lib\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230342 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/991e4c70-1167-498d-8923-054a866e840b-agent-certs\") pod \"konnectivity-agent-qnswk\" (UID: \"991e4c70-1167-498d-8923-054a866e840b\") " pod="kube-system/konnectivity-agent-qnswk" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230424 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-run\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230455 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-slash\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230487 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-run-ovn-kubernetes\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.231333 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230515 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ct4q2\" (UniqueName: \"kubernetes.io/projected/a139498f-5c4f-4db0-a95e-1d466b43fc87-kube-api-access-ct4q2\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230558 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-sys-fs\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230595 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cnibin\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230620 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-var-lib-kubelet\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230647 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4f9tq\" (UniqueName: \"kubernetes.io/projected/395ffea5-2185-4b70-b02f-0698dc276b8f-kube-api-access-4f9tq\") pod \"tuned-8ddqs\" (UID: 
\"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230703 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-cnibin\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230729 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-lib-modules\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230745 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovnkube-script-lib\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230760 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-slash\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230787 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cnibin\") pod \"multus-additional-cni-plugins-cskjl\" (UID: 
\"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230733 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovnkube-config\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.231333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230759 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-openvswitch\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230816 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-sys-fs\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230838 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-host-run-ovn-kubernetes\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230841 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-run-openvswitch\") pod 
\"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230869 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.230907 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-systemd-units\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231022 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-node-log\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231049 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-systemd-units\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231107 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b566b258-1bd9-4623-93bd-ae1931a2bc34-host\") pod \"node-ca-6lrl9\" (UID: 
\"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.230991 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231127 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-node-log\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231184 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231200 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/b566b258-1bd9-4623-93bd-ae1931a2bc34-host\") pod \"node-ca-6lrl9\" (UID: \"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.231231 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:00.731198813 +0000 UTC m=+3.003128164 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231252 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-kubelet\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231325 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7f7g\" (UniqueName: \"kubernetes.io/projected/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-kube-api-access-h7f7g\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231356 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-cni-bin\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.232112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231414 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-etc-openvswitch\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232918 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231444 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-registration-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231491 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-device-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231522 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysconfig\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231535 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-etc-openvswitch\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231566 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-device-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " 
pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231608 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/a48db93d-b4e4-47c7-b0f0-6499dce26319-registration-dir\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231641 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-systemd\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231654 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231668 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/395ffea5-2185-4b70-b02f-0698dc276b8f-tmp\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231704 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: 
\"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-multus-certs\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231731 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-etc-kubernetes\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231758 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-log-socket\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231778 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-os-release\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231793 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231797 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"log-socket\" (UniqueName: \"kubernetes.io/host-path/a139498f-5c4f-4db0-a95e-1d466b43fc87-log-socket\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231810 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:00.232918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231847 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-os-release\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231922 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-25nv7\" (UniqueName: \"kubernetes.io/projected/bc88e01c-0268-427e-bd19-2df1ccdb32a0-kube-api-access-25nv7\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231946 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc88e01c-0268-427e-bd19-2df1ccdb32a0-cni-binary-copy\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231966 
2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-socket-dir-parent\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.231986 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-cni-multus\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.232015 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-env-overrides\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.232089 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysctl-d\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.232121 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-sys\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.233737 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.232145 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-cni-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.232368 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/a139498f-5c4f-4db0-a95e-1d466b43fc87-env-overrides\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.233737 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.232886 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/a139498f-5c4f-4db0-a95e-1d466b43fc87-ovn-node-metrics-cert\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.238530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.238473 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8t94c\" (UniqueName: \"kubernetes.io/projected/a48db93d-b4e4-47c7-b0f0-6499dce26319-kube-api-access-8t94c\") pod \"aws-ebs-csi-driver-node-84j8r\" (UID: \"a48db93d-b4e4-47c7-b0f0-6499dce26319\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" Apr 23 13:31:00.238712 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.238688 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbd9v\" (UniqueName: \"kubernetes.io/projected/c3fdc691-5051-4c4c-8360-ed987a28f315-kube-api-access-gbd9v\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " 
pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:00.239821 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.239800 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xdd2p\" (UniqueName: \"kubernetes.io/projected/b566b258-1bd9-4623-93bd-ae1931a2bc34-kube-api-access-xdd2p\") pod \"node-ca-6lrl9\" (UID: \"b566b258-1bd9-4623-93bd-ae1931a2bc34\") " pod="openshift-image-registry/node-ca-6lrl9" Apr 23 13:31:00.243527 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.243506 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ct4q2\" (UniqueName: \"kubernetes.io/projected/a139498f-5c4f-4db0-a95e-1d466b43fc87-kube-api-access-ct4q2\") pod \"ovnkube-node-kgrc5\" (UID: \"a139498f-5c4f-4db0-a95e-1d466b43fc87\") " pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:31:00.243850 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.243834 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7f7g\" (UniqueName: \"kubernetes.io/projected/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-kube-api-access-h7f7g\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.244550 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.244527 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/74d0665e-9801-46d4-acaf-54aeb0d3ecd2-cni-binary-copy\") pod \"multus-additional-cni-plugins-cskjl\" (UID: \"74d0665e-9801-46d4-acaf-54aeb0d3ecd2\") " pod="openshift-multus/multus-additional-cni-plugins-cskjl" Apr 23 13:31:00.244550 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.244540 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/991e4c70-1167-498d-8923-054a866e840b-agent-certs\") pod 
\"konnectivity-agent-qnswk\" (UID: \"991e4c70-1167-498d-8923-054a866e840b\") " pod="kube-system/konnectivity-agent-qnswk" Apr 23 13:31:00.332441 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332409 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-kubelet\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332461 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-cni-bin\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332495 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysconfig\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332518 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-systemd\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332524 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-cni-bin\") pod 
\"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332544 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/395ffea5-2185-4b70-b02f-0698dc276b8f-tmp\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332517 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-kubelet\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332562 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysconfig\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332570 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-multus-certs\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332601 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-systemd\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " 
pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.332629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332613 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-multus-certs\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332646 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-etc-kubernetes\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332677 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332702 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-25nv7\" (UniqueName: \"kubernetes.io/projected/bc88e01c-0268-427e-bd19-2df1ccdb32a0-kube-api-access-25nv7\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332705 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-etc-kubernetes\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " 
pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332741 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc88e01c-0268-427e-bd19-2df1ccdb32a0-cni-binary-copy\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332758 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-socket-dir-parent\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332778 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-cni-multus\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332796 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysctl-d\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332811 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-sys\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" 
Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332826 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-cni-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332857 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jllwj\" (UniqueName: \"kubernetes.io/projected/edc72f86-b958-4ee0-aa99-387d9ee23402-kube-api-access-jllwj\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332891 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-var-lib-cni-multus\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332916 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-sys\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.332881 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-system-cni-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:31:00.332994 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-system-cni-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333049 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-cni-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333051 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysctl-d\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" Apr 23 13:31:00.333092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333072 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-socket-dir-parent\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333131 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-k8s-cni-cncf-io\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k" Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333182 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-conf-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333207 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-k8s-cni-cncf-io\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333250 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-conf-dir\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333257 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edc72f86-b958-4ee0-aa99-387d9ee23402-host-slash\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333212 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/edc72f86-b958-4ee0-aa99-387d9ee23402-host-slash\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333299 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/edc72f86-b958-4ee0-aa99-387d9ee23402-iptables-alerter-script\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333328 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysctl-conf\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333354 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-host\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333378 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-os-release\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333402 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-netns\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333413 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/bc88e01c-0268-427e-bd19-2df1ccdb32a0-cni-binary-copy\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333424 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-daemon-config\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333565 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-sysctl-conf\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333622 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-os-release\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333679 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-host-run-netns\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333719 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-host\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.333812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333779 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-modprobe-d\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333861 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/bc88e01c-0268-427e-bd19-2df1ccdb32a0-multus-daemon-config\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333879 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/edc72f86-b958-4ee0-aa99-387d9ee23402-iptables-alerter-script\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333890 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-kubernetes\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333918 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-tuned\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333929 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-kubernetes\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333933 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-hostroot\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333962 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-run\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333965 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-hostroot\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333865 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-modprobe-d\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333988 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-run\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.333990 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-var-lib-kubelet\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.334026 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-var-lib-kubelet\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.334027 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4f9tq\" (UniqueName: \"kubernetes.io/projected/395ffea5-2185-4b70-b02f-0698dc276b8f-kube-api-access-4f9tq\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.334057 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-cnibin\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.334081 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-lib-modules\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.334233 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/bc88e01c-0268-427e-bd19-2df1ccdb32a0-cnibin\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.334530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.334237 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/395ffea5-2185-4b70-b02f-0698dc276b8f-lib-modules\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.335494 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.335470 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/395ffea5-2185-4b70-b02f-0698dc276b8f-tmp\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.336276 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.336256 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/395ffea5-2185-4b70-b02f-0698dc276b8f-etc-tuned\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.338680 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.338660 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:00.338680 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.338680 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:00.338943 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.338691 2581 projected.go:194] Error preparing data for projected volume kube-api-access-ktx7q for pod openshift-network-diagnostics/network-check-target-749xq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:00.338943 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.338750 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q podName:9d83eb43-b297-4115-9710-b7d78d042392 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:00.838733734 +0000 UTC m=+3.110663046 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-ktx7q" (UniqueName: "kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q") pod "network-check-target-749xq" (UID: "9d83eb43-b297-4115-9710-b7d78d042392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:00.341110 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.341081 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-25nv7\" (UniqueName: \"kubernetes.io/projected/bc88e01c-0268-427e-bd19-2df1ccdb32a0-kube-api-access-25nv7\") pod \"multus-vlh7k\" (UID: \"bc88e01c-0268-427e-bd19-2df1ccdb32a0\") " pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.341241 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.341223 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jllwj\" (UniqueName: \"kubernetes.io/projected/edc72f86-b958-4ee0-aa99-387d9ee23402-kube-api-access-jllwj\") pod \"iptables-alerter-682sp\" (UID: \"edc72f86-b958-4ee0-aa99-387d9ee23402\") " pod="openshift-network-operator/iptables-alerter-682sp"
Apr 23 13:31:00.341339 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.341321 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4f9tq\" (UniqueName: \"kubernetes.io/projected/395ffea5-2185-4b70-b02f-0698dc276b8f-kube-api-access-4f9tq\") pod \"tuned-8ddqs\" (UID: \"395ffea5-2185-4b70-b02f-0698dc276b8f\") " pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.429463 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.429367 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:00.441330 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.441305 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r"
Apr 23 13:31:00.450971 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.450933 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-6lrl9"
Apr 23 13:31:00.459611 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.459589 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-qnswk"
Apr 23 13:31:00.466261 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.466236 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-cskjl"
Apr 23 13:31:00.475894 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.475865 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-682sp"
Apr 23 13:31:00.484632 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.484605 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-8ddqs"
Apr 23 13:31:00.490361 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.490338 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-vlh7k"
Apr 23 13:31:00.736976 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.736891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:00.737123 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.737066 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:00.737191 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.737147 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:01.737124907 +0000 UTC m=+4.009054226 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:00.938544 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:00.938512 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:00.938710 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.938676 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:00.938710 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.938703 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:00.938797 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.938714 2581 projected.go:194] Error preparing data for projected volume kube-api-access-ktx7q for pod openshift-network-diagnostics/network-check-target-749xq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:00.938797 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:00.938766 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q podName:9d83eb43-b297-4115-9710-b7d78d042392 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:01.938752263 +0000 UTC m=+4.210681576 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-ktx7q" (UniqueName: "kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q") pod "network-check-target-749xq" (UID: "9d83eb43-b297-4115-9710-b7d78d042392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:01.041808 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:31:01.041738 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod395ffea5_2185_4b70_b02f_0698dc276b8f.slice/crio-099241607ff791978261be7f66f19d71f0a04dd437d82fb32e08bed18f17d48f WatchSource:0}: Error finding container 099241607ff791978261be7f66f19d71f0a04dd437d82fb32e08bed18f17d48f: Status 404 returned error can't find the container with id 099241607ff791978261be7f66f19d71f0a04dd437d82fb32e08bed18f17d48f
Apr 23 13:31:01.045949 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:31:01.045832 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb566b258_1bd9_4623_93bd_ae1931a2bc34.slice/crio-9600a2c344a67f5b4bb30516a38bec4825b798ef7ffc34cf60883fb72073fc4d WatchSource:0}: Error finding container 9600a2c344a67f5b4bb30516a38bec4825b798ef7ffc34cf60883fb72073fc4d: Status 404 returned error can't find the container with id 9600a2c344a67f5b4bb30516a38bec4825b798ef7ffc34cf60883fb72073fc4d
Apr 23 13:31:01.047545 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:31:01.047519 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda48db93d_b4e4_47c7_b0f0_6499dce26319.slice/crio-1a1b00a6ed72f4981c64a3c8f1f753268eeac36f4d291a38fc055dc384218f98 WatchSource:0}: Error finding container 1a1b00a6ed72f4981c64a3c8f1f753268eeac36f4d291a38fc055dc384218f98: Status 404 returned error can't find the container with id 1a1b00a6ed72f4981c64a3c8f1f753268eeac36f4d291a38fc055dc384218f98
Apr 23 13:31:01.047864 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:31:01.047839 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod74d0665e_9801_46d4_acaf_54aeb0d3ecd2.slice/crio-89ab5595e9bd2bf3bb568fd593d6120a6df6df3e65eff06c43196bb1e019cee7 WatchSource:0}: Error finding container 89ab5595e9bd2bf3bb568fd593d6120a6df6df3e65eff06c43196bb1e019cee7: Status 404 returned error can't find the container with id 89ab5595e9bd2bf3bb568fd593d6120a6df6df3e65eff06c43196bb1e019cee7
Apr 23 13:31:01.049527 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:31:01.049504 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda139498f_5c4f_4db0_a95e_1d466b43fc87.slice/crio-cdfddf924984b174c9eed7bdf2421f717ee7f93f7e6b667d76a8af48b41e0093 WatchSource:0}: Error finding container cdfddf924984b174c9eed7bdf2421f717ee7f93f7e6b667d76a8af48b41e0093: Status 404 returned error can't find the container with id cdfddf924984b174c9eed7bdf2421f717ee7f93f7e6b667d76a8af48b41e0093
Apr 23 13:31:01.051554 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:31:01.051530 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod991e4c70_1167_498d_8923_054a866e840b.slice/crio-eb4096f2715cba78c159a1c908c6795dd0e56ba4db0bf20ec834820993433fe8 WatchSource:0}: Error finding container eb4096f2715cba78c159a1c908c6795dd0e56ba4db0bf20ec834820993433fe8: Status 404 returned error can't find the container with id eb4096f2715cba78c159a1c908c6795dd0e56ba4db0bf20ec834820993433fe8
Apr 23 13:31:01.159411 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.159232 2581 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:25:59 +0000 UTC" deadline="2028-01-02 09:47:48.612107476 +0000 UTC"
Apr 23 13:31:01.159411 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.159412 2581 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14852h16m47.452701059s"
Apr 23 13:31:01.191899 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.191872 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal" event={"ID":"377b4825cda22d155cdffc4e7cfa1c2e","Type":"ContainerStarted","Data":"cfcb188d08090f10cd51d24dbdd5cefe7d74b3b372d7281f6436fe4eb5fd050f"}
Apr 23 13:31:01.192923 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.192900 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vlh7k" event={"ID":"bc88e01c-0268-427e-bd19-2df1ccdb32a0","Type":"ContainerStarted","Data":"a85201eccd3b581a833187b0fc95858a9fc5614550ce031bbd0c5fcdb1f8db0b"}
Apr 23 13:31:01.193916 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.193899 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qnswk" event={"ID":"991e4c70-1167-498d-8923-054a866e840b","Type":"ContainerStarted","Data":"eb4096f2715cba78c159a1c908c6795dd0e56ba4db0bf20ec834820993433fe8"}
Apr 23 13:31:01.194873 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.194853 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerStarted","Data":"89ab5595e9bd2bf3bb568fd593d6120a6df6df3e65eff06c43196bb1e019cee7"}
Apr 23 13:31:01.195785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.195759 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" event={"ID":"a48db93d-b4e4-47c7-b0f0-6499dce26319","Type":"ContainerStarted","Data":"1a1b00a6ed72f4981c64a3c8f1f753268eeac36f4d291a38fc055dc384218f98"}
Apr 23 13:31:01.196812 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.196782 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-682sp" event={"ID":"edc72f86-b958-4ee0-aa99-387d9ee23402","Type":"ContainerStarted","Data":"08f00c5fb0f8e3aba90d5e0b4b75bed4859e6fb580a66acc8f7547b6c39b0b49"}
Apr 23 13:31:01.198272 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.198248 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"cdfddf924984b174c9eed7bdf2421f717ee7f93f7e6b667d76a8af48b41e0093"}
Apr 23 13:31:01.200020 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.200002 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6lrl9" event={"ID":"b566b258-1bd9-4623-93bd-ae1931a2bc34","Type":"ContainerStarted","Data":"9600a2c344a67f5b4bb30516a38bec4825b798ef7ffc34cf60883fb72073fc4d"}
Apr 23 13:31:01.200804 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.200783 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" event={"ID":"395ffea5-2185-4b70-b02f-0698dc276b8f","Type":"ContainerStarted","Data":"099241607ff791978261be7f66f19d71f0a04dd437d82fb32e08bed18f17d48f"}
Apr 23 13:31:01.203926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.203882 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-133-33.ec2.internal" podStartSLOduration=2.203872293 podStartE2EDuration="2.203872293s" podCreationTimestamp="2026-04-23 13:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:01.203799855 +0000 UTC m=+3.475729188" watchObservedRunningTime="2026-04-23 13:31:01.203872293 +0000 UTC m=+3.475801634"
Apr 23 13:31:01.743931 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.743893 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:01.744108 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:01.744073 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:01.744193 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:01.744136 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:03.744117052 +0000 UTC m=+6.016046368 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:01.883743 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.883709 2581 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:01.945596 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:01.945327 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:01.945596 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:01.945497 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:01.945596 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:01.945523 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:01.945596 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:01.945536 2581 projected.go:194] Error preparing data for projected volume kube-api-access-ktx7q for pod openshift-network-diagnostics/network-check-target-749xq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:01.945596 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:01.945593 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q podName:9d83eb43-b297-4115-9710-b7d78d042392 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:03.945573091 +0000 UTC m=+6.217502407 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-ktx7q" (UniqueName: "kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q") pod "network-check-target-749xq" (UID: "9d83eb43-b297-4115-9710-b7d78d042392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:02.184607 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:02.183870 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:02.184607 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:02.184014 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:02.184607 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:02.184437 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:02.184607 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:02.184524 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392"
Apr 23 13:31:02.218818 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:02.218705 2581 generic.go:358] "Generic (PLEG): container finished" podID="d6d6b5027305b2401681b1fe05011e3e" containerID="a2dc2c0f7297ee250ab17b399a20bb1864e0421cbcdc134459800838f770b10c" exitCode=0
Apr 23 13:31:02.218818 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:02.218778 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" event={"ID":"d6d6b5027305b2401681b1fe05011e3e","Type":"ContainerDied","Data":"a2dc2c0f7297ee250ab17b399a20bb1864e0421cbcdc134459800838f770b10c"}
Apr 23 13:31:03.239457 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:03.238508 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" event={"ID":"d6d6b5027305b2401681b1fe05011e3e","Type":"ContainerStarted","Data":"14727f13ad5e431dafb39a34b0abd671175a71d64c21c21be9419700e124d623"}
Apr 23 13:31:03.759648 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:03.759568 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:03.759833 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:03.759749 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:03.759833 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:03.759824 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:07.759805794 +0000 UTC m=+10.031735119 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:03.961747 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:03.961704 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:03.961923 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:03.961874 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:03.961923 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:03.961896 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:03.961923 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:03.961910 2581 projected.go:194] Error preparing data for projected volume kube-api-access-ktx7q for pod openshift-network-diagnostics/network-check-target-749xq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:03.962030 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:03.961986 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q podName:9d83eb43-b297-4115-9710-b7d78d042392 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:07.96196503 +0000 UTC m=+10.233894347 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-ktx7q" (UniqueName: "kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q") pod "network-check-target-749xq" (UID: "9d83eb43-b297-4115-9710-b7d78d042392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:04.185639 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:04.185610 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:04.185794 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:04.185748 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:04.186169 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:04.186128 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:04.186280 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:04.186242 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?"
pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:06.183809 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.183323 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:06.183809 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.183338 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:06.183809 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:06.183464 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315" Apr 23 13:31:06.183809 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:06.183514 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:06.653369 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.653309 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-133-33.ec2.internal" podStartSLOduration=7.653289158 podStartE2EDuration="7.653289158s" podCreationTimestamp="2026-04-23 13:30:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:03.253662622 +0000 UTC m=+5.525591957" watchObservedRunningTime="2026-04-23 13:31:06.653289158 +0000 UTC m=+8.925218493" Apr 23 13:31:06.653801 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.653782 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-272n2"] Apr 23 13:31:06.657054 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.657022 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.657245 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:06.657114 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42" Apr 23 13:31:06.685925 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.685888 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.686091 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.685993 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b3934add-0113-4201-a17a-eaa6e5cbec42-kubelet-config\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.686091 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.686036 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b3934add-0113-4201-a17a-eaa6e5cbec42-dbus\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.787184 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.787129 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b3934add-0113-4201-a17a-eaa6e5cbec42-dbus\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.787362 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.787222 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: 
\"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.787362 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.787335 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b3934add-0113-4201-a17a-eaa6e5cbec42-kubelet-config\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.787462 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.787429 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/b3934add-0113-4201-a17a-eaa6e5cbec42-kubelet-config\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.787971 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:06.787577 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/b3934add-0113-4201-a17a-eaa6e5cbec42-dbus\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:06.787971 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:06.787679 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:06.787971 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:06.787734 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret podName:b3934add-0113-4201-a17a-eaa6e5cbec42 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:07.287716504 +0000 UTC m=+9.559645832 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret") pod "global-pull-secret-syncer-272n2" (UID: "b3934add-0113-4201-a17a-eaa6e5cbec42") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:07.290769 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:07.290734 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:07.291231 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:07.290901 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:07.291231 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:07.290963 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret podName:b3934add-0113-4201-a17a-eaa6e5cbec42 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:08.290945542 +0000 UTC m=+10.562874856 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret") pod "global-pull-secret-syncer-272n2" (UID: "b3934add-0113-4201-a17a-eaa6e5cbec42") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:07.795659 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:07.795056 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:07.795659 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:07.795233 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:07.795659 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:07.795301 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:15.795285985 +0000 UTC m=+18.067215299 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:07.997263 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:07.996619 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:07.997263 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:07.996799 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:07.997263 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:07.996824 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:07.997263 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:07.996836 2581 projected.go:194] Error preparing data for projected volume kube-api-access-ktx7q for pod openshift-network-diagnostics/network-check-target-749xq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:07.997263 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:07.996885 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q podName:9d83eb43-b297-4115-9710-b7d78d042392 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:15.9968684 +0000 UTC m=+18.268797713 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-ktx7q" (UniqueName: "kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q") pod "network-check-target-749xq" (UID: "9d83eb43-b297-4115-9710-b7d78d042392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:08.184487 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:08.184454 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:08.184680 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:08.184643 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:08.189497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:08.186010 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:08.189497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:08.186028 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:08.189497 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:08.186172 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315" Apr 23 13:31:08.189497 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:08.186916 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42" Apr 23 13:31:08.298576 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:08.298532 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:08.298918 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:08.298660 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:08.298918 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:08.298724 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret podName:b3934add-0113-4201-a17a-eaa6e5cbec42 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:10.29870555 +0000 UTC m=+12.570634866 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret") pod "global-pull-secret-syncer-272n2" (UID: "b3934add-0113-4201-a17a-eaa6e5cbec42") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:10.184332 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:10.184300 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:10.184773 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:10.184303 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:10.184773 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:10.184430 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42" Apr 23 13:31:10.184773 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:10.184303 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:10.184773 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:10.184539 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315" Apr 23 13:31:10.184773 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:10.184587 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:10.314261 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:10.313638 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:10.314261 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:10.313805 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:10.314261 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:10.313867 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret podName:b3934add-0113-4201-a17a-eaa6e5cbec42 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:14.313849509 +0000 UTC m=+16.585778823 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret") pod "global-pull-secret-syncer-272n2" (UID: "b3934add-0113-4201-a17a-eaa6e5cbec42") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:12.183385 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:12.183351 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:12.183854 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:12.183351 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:12.183854 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:12.183364 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:12.183854 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:12.183471 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42" Apr 23 13:31:12.183854 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:12.183554 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315" Apr 23 13:31:12.183854 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:12.183626 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:14.183729 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:14.183694 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:14.184206 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:14.183854 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42" Apr 23 13:31:14.184278 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:14.184222 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:14.184505 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:14.184325 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315" Apr 23 13:31:14.184505 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:14.184378 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:14.184505 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:14.184473 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:14.341112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:14.341078 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:14.341287 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:14.341259 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:14.341349 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:14.341322 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret podName:b3934add-0113-4201-a17a-eaa6e5cbec42 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:22.341306076 +0000 UTC m=+24.613235388 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret") pod "global-pull-secret-syncer-272n2" (UID: "b3934add-0113-4201-a17a-eaa6e5cbec42") : object "kube-system"/"original-pull-secret" not registered Apr 23 13:31:15.488166 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.488127 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/node-resolver-57mv2"] Apr 23 13:31:15.567989 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.567904 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-57mv2" Apr 23 13:31:15.570809 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.570784 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:31:15.570950 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.570844 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:31:15.570950 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.570856 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-btbkw\"" Apr 23 13:31:15.649896 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.649860 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-hosts-file\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2" Apr 23 13:31:15.650085 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.649916 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9pcsl\" (UniqueName: 
\"kubernetes.io/projected/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-kube-api-access-9pcsl\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:15.650085 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.650072 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-tmp-dir\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:15.750656 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.750624 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-hosts-file\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:15.750843 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.750679 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9pcsl\" (UniqueName: \"kubernetes.io/projected/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-kube-api-access-9pcsl\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:15.750843 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.750746 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-hosts-file\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:15.750843 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.750755 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-tmp-dir\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:15.751088 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.751067 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-tmp-dir\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:15.773721 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.773692 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9pcsl\" (UniqueName: \"kubernetes.io/projected/72a4caa3-d1c9-4761-adeb-e08cb9c63ab4-kube-api-access-9pcsl\") pod \"node-resolver-57mv2\" (UID: \"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4\") " pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:15.851890 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.851805 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:15.852035 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:15.851975 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:15.852088 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:15.852036 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:31.852020171 +0000 UTC m=+34.123949483 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:15.877535 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:15.877499 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-57mv2"
Apr 23 13:31:16.052914 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:16.052874 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:16.053096 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:16.053021 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:16.053096 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:16.053041 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:16.053096 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:16.053052 2581 projected.go:194] Error preparing data for projected volume kube-api-access-ktx7q for pod openshift-network-diagnostics/network-check-target-749xq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:16.053280 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:16.053113 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q podName:9d83eb43-b297-4115-9710-b7d78d042392 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:32.053094733 +0000 UTC m=+34.325024054 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-ktx7q" (UniqueName: "kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q") pod "network-check-target-749xq" (UID: "9d83eb43-b297-4115-9710-b7d78d042392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:16.183589 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:16.183552 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:16.183589 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:16.183587 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:16.183824 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:16.183661 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392"
Apr 23 13:31:16.183824 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:16.183705 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:16.183824 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:16.183811 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:16.184002 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:16.183925 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42"
Apr 23 13:31:18.184305 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:18.184279 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:18.184638 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:18.184357 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42"
Apr 23 13:31:18.184638 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:18.184371 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:18.184638 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:18.184491 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:18.184638 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:18.184537 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:18.184638 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:18.184611 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392"
Apr 23 13:31:18.243065 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:31:18.243039 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72a4caa3_d1c9_4761_adeb_e08cb9c63ab4.slice/crio-f28e798867c47cee66053ca51f458687a4c8af8cb76501f6bcf3588a785903f0 WatchSource:0}: Error finding container f28e798867c47cee66053ca51f458687a4c8af8cb76501f6bcf3588a785903f0: Status 404 returned error can't find the container with id f28e798867c47cee66053ca51f458687a4c8af8cb76501f6bcf3588a785903f0
Apr 23 13:31:18.269434 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:18.269411 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-57mv2" event={"ID":"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4","Type":"ContainerStarted","Data":"f28e798867c47cee66053ca51f458687a4c8af8cb76501f6bcf3588a785903f0"}
Apr 23 13:31:19.272732 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.272333 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-qnswk" event={"ID":"991e4c70-1167-498d-8923-054a866e840b","Type":"ContainerStarted","Data":"17d031f76d4ea40c740142e4f8d6f294bffefdb9de12bf06b30bf5b269c8f2de"}
Apr 23 13:31:19.274171 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.274126 2581 generic.go:358] "Generic (PLEG): container finished" podID="74d0665e-9801-46d4-acaf-54aeb0d3ecd2" containerID="35b8123f3cf5d6692dad526922da0518c1c138908ee41bf031e6515311b17aa1" exitCode=0
Apr 23 13:31:19.274297 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.274188 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerDied","Data":"35b8123f3cf5d6692dad526922da0518c1c138908ee41bf031e6515311b17aa1"}
Apr 23 13:31:19.278560 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.278538 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" event={"ID":"a48db93d-b4e4-47c7-b0f0-6499dce26319","Type":"ContainerStarted","Data":"01ab8bf25cfbe39f48490790c21c7726218d64b7fa4f8887e33d73d043cca571"}
Apr 23 13:31:19.280760 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.280361 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-57mv2" event={"ID":"72a4caa3-d1c9-4761-adeb-e08cb9c63ab4","Type":"ContainerStarted","Data":"1b41b21f1a7ae7598d56575b4a23885bd14c9b38452e1ea1b5c88b3076a482d4"}
Apr 23 13:31:19.285038 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.285006 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log"
Apr 23 13:31:19.285351 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.285327 2581 generic.go:358] "Generic (PLEG): container finished" podID="a139498f-5c4f-4db0-a95e-1d466b43fc87" containerID="59ecd18fdc5c28fff45858b9942f2ad101fe02bce3a6c253bf58bf4388f497ab" exitCode=1
Apr 23 13:31:19.285448 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.285394 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"b07c822b9265bf8372a02f0060fc4a8a3d95f1b3261eb9a37158013651c4c057"}
Apr 23 13:31:19.285448 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.285420 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"9175fc8c1e50d9095c7f97aec0e9da36a2df969432f3cad4a3032a37e439fe1b"}
Apr 23 13:31:19.285448 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.285434 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"d42ec08d3b9a062fdecaedf3aab32dc66f6af49061d2510822a3d74e956c6f9a"}
Apr 23 13:31:19.285448 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.285447 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"cf0d62adb4cdbc16919b24636376bd7d3be5e282456460b983f97db5d915bb00"}
Apr 23 13:31:19.285626 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.285460 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerDied","Data":"59ecd18fdc5c28fff45858b9942f2ad101fe02bce3a6c253bf58bf4388f497ab"}
Apr 23 13:31:19.285626 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.285474 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"8fb3daa31a5b95f036a581e8133187c4060a1255dc5bcb4a684634b3e3ceb63c"}
Apr 23 13:31:19.287590 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.287567 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-6lrl9" event={"ID":"b566b258-1bd9-4623-93bd-ae1931a2bc34","Type":"ContainerStarted","Data":"51c3d6689fb18036f74212094c9fb5fae7a28541f447e031e9a9cb9d5e2bdcd1"}
Apr 23 13:31:19.288875 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.288854 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" event={"ID":"395ffea5-2185-4b70-b02f-0698dc276b8f","Type":"ContainerStarted","Data":"9890231b2e71a305f347400442f710b7a71c862ee117dd7b2347a6fd495f294a"}
Apr 23 13:31:19.290410 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.290376 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-vlh7k" event={"ID":"bc88e01c-0268-427e-bd19-2df1ccdb32a0","Type":"ContainerStarted","Data":"e975d94faaf3ddead34684d8885c900e5d912c04d503d81742341633f1c6f82d"}
Apr 23 13:31:19.296658 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.296623 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-qnswk" podStartSLOduration=4.16592961 podStartE2EDuration="21.296608256s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:31:01.074905438 +0000 UTC m=+3.346834755" lastFinishedPulling="2026-04-23 13:31:18.205584074 +0000 UTC m=+20.477513401" observedRunningTime="2026-04-23 13:31:19.285625549 +0000 UTC m=+21.557554886" watchObservedRunningTime="2026-04-23 13:31:19.296608256 +0000 UTC m=+21.568537590"
Apr 23 13:31:19.297382 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.297355 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-57mv2" podStartSLOduration=4.297347145 podStartE2EDuration="4.297347145s" podCreationTimestamp="2026-04-23 13:31:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:19.296697912 +0000 UTC m=+21.568627247" watchObservedRunningTime="2026-04-23 13:31:19.297347145 +0000 UTC m=+21.569276482"
Apr 23 13:31:19.323735 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.323680 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-6lrl9" podStartSLOduration=4.165740604 podStartE2EDuration="21.323663309s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:31:01.047636527 +0000 UTC m=+3.319565839" lastFinishedPulling="2026-04-23 13:31:18.205559216 +0000 UTC m=+20.477488544" observedRunningTime="2026-04-23 13:31:19.323049003 +0000 UTC m=+21.594978338" watchObservedRunningTime="2026-04-23 13:31:19.323663309 +0000 UTC m=+21.595592640"
Apr 23 13:31:19.336111 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.336055 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-8ddqs" podStartSLOduration=4.146446326 podStartE2EDuration="21.336038656s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:31:01.043501801 +0000 UTC m=+3.315431115" lastFinishedPulling="2026-04-23 13:31:18.233094128 +0000 UTC m=+20.505023445" observedRunningTime="2026-04-23 13:31:19.335594206 +0000 UTC m=+21.607523539" watchObservedRunningTime="2026-04-23 13:31:19.336038656 +0000 UTC m=+21.607967991"
Apr 23 13:31:19.349570 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:19.349530 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-vlh7k" podStartSLOduration=4.177832018 podStartE2EDuration="21.349517385s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:31:01.074840189 +0000 UTC m=+3.346769502" lastFinishedPulling="2026-04-23 13:31:18.246525555 +0000 UTC m=+20.518454869" observedRunningTime="2026-04-23 13:31:19.349372975 +0000 UTC m=+21.621302310" watchObservedRunningTime="2026-04-23 13:31:19.349517385 +0000 UTC m=+21.621446718"
Apr 23 13:31:20.062215 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.062189 2581 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 23 13:31:20.183511 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.183435 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:20.183670 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:20.183558 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42"
Apr 23 13:31:20.183670 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.183634 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:20.183779 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:20.183731 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392"
Apr 23 13:31:20.183779 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.183769 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:20.183867 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:20.183840 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:20.194585 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.194498 2581 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:31:20.062210582Z","UUID":"5477b9a8-4610-4f40-a3c0-ed5f4575eeed","Handler":null,"Name":"","Endpoint":""}
Apr 23 13:31:20.196204 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.196181 2581 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 23 13:31:20.196204 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.196208 2581 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 23 13:31:20.299336 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.299303 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" event={"ID":"a48db93d-b4e4-47c7-b0f0-6499dce26319","Type":"ContainerStarted","Data":"c6560743c2699fe9639858684dc3720225a8319d23fbce520745e2118298d4f9"}
Apr 23 13:31:20.300708 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:20.300648 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-682sp" event={"ID":"edc72f86-b958-4ee0-aa99-387d9ee23402","Type":"ContainerStarted","Data":"051bc1e1fcba90f1812a47e90170865d1aaf1a979f985ba17e366baa0d23ee06"}
Apr 23 13:31:21.140353 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:21.140319 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-qnswk"
Apr 23 13:31:21.141045 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:21.141022 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-qnswk"
Apr 23 13:31:21.156864 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:21.156817 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-682sp" podStartSLOduration=6.026078415 podStartE2EDuration="23.156799267s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:31:01.074849279 +0000 UTC m=+3.346778595" lastFinishedPulling="2026-04-23 13:31:18.205570134 +0000 UTC m=+20.477499447" observedRunningTime="2026-04-23 13:31:20.325697593 +0000 UTC m=+22.597626928" watchObservedRunningTime="2026-04-23 13:31:21.156799267 +0000 UTC m=+23.428728600"
Apr 23 13:31:21.306567 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:21.306398 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log"
Apr 23 13:31:21.307145 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:21.307116 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"4f7a4ad5d19ef627ddc65bf965f4ace57ec86779df8b396773c99ecd59fe2d37"}
Apr 23 13:31:21.471992 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:21.471940 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-qnswk"
Apr 23 13:31:21.472600 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:21.472584 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-qnswk"
Apr 23 13:31:22.183846 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:22.183818 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:22.184031 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:22.183813 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:22.184031 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:22.183969 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42"
Apr 23 13:31:22.184031 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:22.184024 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392"
Apr 23 13:31:22.184216 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:22.184067 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:22.184216 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:22.184120 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:22.311277 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:22.311246 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" event={"ID":"a48db93d-b4e4-47c7-b0f0-6499dce26319","Type":"ContainerStarted","Data":"a31c4c8fb2dd0eb9511dcd7f796a728a0e3dae9b61bca9e4ff1944ab5e70ca69"}
Apr 23 13:31:22.327467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:22.327413 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-84j8r" podStartSLOduration=3.901815048 podStartE2EDuration="24.327394924s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:31:01.050191596 +0000 UTC m=+3.322120907" lastFinishedPulling="2026-04-23 13:31:21.475771457 +0000 UTC m=+23.747700783" observedRunningTime="2026-04-23 13:31:22.327116527 +0000 UTC m=+24.599045861" watchObservedRunningTime="2026-04-23 13:31:22.327394924 +0000 UTC m=+24.599324258"
Apr 23 13:31:22.397643 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:22.397606 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:22.397805 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:22.397756 2581 secret.go:189] Couldn't get secret kube-system/original-pull-secret: object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:22.397862 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:22.397830 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret podName:b3934add-0113-4201-a17a-eaa6e5cbec42 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:38.39781049 +0000 UTC m=+40.669739818 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "original-pull-secret" (UniqueName: "kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret") pod "global-pull-secret-syncer-272n2" (UID: "b3934add-0113-4201-a17a-eaa6e5cbec42") : object "kube-system"/"original-pull-secret" not registered
Apr 23 13:31:24.184296 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.184104 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:24.184850 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.184144 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:24.184850 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.184190 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:24.184850 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:24.184457 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42"
Apr 23 13:31:24.184850 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:24.184357 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392"
Apr 23 13:31:24.184850 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:24.184561 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:24.318640 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.318616 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log"
Apr 23 13:31:24.319005 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.318979 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"7b444ef03ea3728a166ef7291352bfa6e85d45bb96777d5f7a3dc49bbc76be49"}
Apr 23 13:31:24.319290 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.319267 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:24.319371 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.319301 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:24.319454 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.319439 2581 scope.go:117] "RemoveContainer" containerID="59ecd18fdc5c28fff45858b9942f2ad101fe02bce3a6c253bf58bf4388f497ab"
Apr 23 13:31:24.320697 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.320675 2581 generic.go:358] "Generic (PLEG): container finished" podID="74d0665e-9801-46d4-acaf-54aeb0d3ecd2" containerID="1adab3452a45adb6a7e4635f6d4202b21456b7f3e62575971b2c64ec5385e171" exitCode=0
Apr 23 13:31:24.320755 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.320709 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerDied","Data":"1adab3452a45adb6a7e4635f6d4202b21456b7f3e62575971b2c64ec5385e171"}
Apr 23 13:31:24.335412 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:24.335390 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:25.324669 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.324634 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerStarted","Data":"57a3f97a27adaa9beff77a674550ff53a14dba23c9ca89d3d1c488a2d910af90"}
Apr 23 13:31:25.328721 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.328697 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log"
Apr 23 13:31:25.329090 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.329048 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" event={"ID":"a139498f-5c4f-4db0-a95e-1d466b43fc87","Type":"ContainerStarted","Data":"27db0f1c3d7056667b3b1b240401c83d73f85171392e1fcd346093b424342e81"}
Apr 23 13:31:25.329972 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.329952 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:25.345096 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.345068 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5"
Apr 23 13:31:25.376074 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.376020 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" podStartSLOduration=10.168232979 podStartE2EDuration="27.376005373s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:31:01.075027371 +0000 UTC m=+3.346956686" lastFinishedPulling="2026-04-23 13:31:18.282799766 +0000 UTC m=+20.554729080" observedRunningTime="2026-04-23 13:31:25.374497231 +0000 UTC m=+27.646426566" watchObservedRunningTime="2026-04-23 13:31:25.376005373 +0000 UTC m=+27.647934707"
Apr 23 13:31:25.534487 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.534409 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-272n2"]
Apr 23 13:31:25.534612 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.534557 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:25.534676 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:25.534657 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42"
Apr 23 13:31:25.537427 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.537399 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fpksp"]
Apr 23 13:31:25.537558 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.537521 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:25.537631 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:25.537614 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:25.537997 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.537977 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-749xq"]
Apr 23 13:31:25.538096 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:25.538074 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:25.538203 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:25.538184 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392"
Apr 23 13:31:26.332771 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:26.332734 2581 generic.go:358] "Generic (PLEG): container finished" podID="74d0665e-9801-46d4-acaf-54aeb0d3ecd2" containerID="57a3f97a27adaa9beff77a674550ff53a14dba23c9ca89d3d1c488a2d910af90" exitCode=0
Apr 23 13:31:26.333331 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:26.332795 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerDied","Data":"57a3f97a27adaa9beff77a674550ff53a14dba23c9ca89d3d1c488a2d910af90"}
Apr 23 13:31:27.183378 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:27.183345 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:27.183527 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:27.183343 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:27.183527 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:27.183449 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:31:27.183613 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:27.183343 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:27.183613 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:27.183523 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:27.183613 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:27.183605 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42" Apr 23 13:31:27.337143 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:27.336965 2581 generic.go:358] "Generic (PLEG): container finished" podID="74d0665e-9801-46d4-acaf-54aeb0d3ecd2" containerID="6c25ef39e90479c55406b63de319b64587c7eeb22f835984067fc1935e244430" exitCode=0 Apr 23 13:31:27.337488 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:27.337044 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerDied","Data":"6c25ef39e90479c55406b63de319b64587c7eeb22f835984067fc1935e244430"} Apr 23 13:31:29.183935 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:29.183890 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:29.184554 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:29.184013 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:29.184554 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:29.184022 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42" Apr 23 13:31:29.184554 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:29.184106 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:29.184554 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:29.184117 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:29.184554 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:29.184310 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315" Apr 23 13:31:31.183592 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.183561 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2" Apr 23 13:31:31.184071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.183606 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:31:31.184071 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.183692 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="kube-system/global-pull-secret-syncer-272n2" podUID="b3934add-0113-4201-a17a-eaa6e5cbec42" Apr 23 13:31:31.184071 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.183754 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-749xq" podUID="9d83eb43-b297-4115-9710-b7d78d042392" Apr 23 13:31:31.184071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.183791 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:31.184071 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.183900 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315" Apr 23 13:31:31.603215 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.603184 2581 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-133-33.ec2.internal" event="NodeReady" Apr 23 13:31:31.603399 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.603341 2581 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:31:31.670089 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.670015 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-68c655548f-7bjs7"] Apr 23 13:31:31.674098 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.674069 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"] Apr 23 13:31:31.674275 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.674260 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.677024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.677001 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:31:31.677024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.677012 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:31:31.677207 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.677043 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fxpfk\"" Apr 23 13:31:31.677207 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.677193 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:31:31.677471 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.677453 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:31:31.680429 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.680405 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-tzxth"] Apr 23 13:31:31.683397 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.682726 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-console\"/\"networking-console-plugin\"" Apr 23 13:31:31.683397 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.683063 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"networking-console-plugin-cert\"" Apr 23 13:31:31.684602 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.684005 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xsbtc\"" Apr 23 13:31:31.685745 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.685724 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-785s4"] Apr 23 13:31:31.689308 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.688812 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 13:31:31.689449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.689273 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:31:31.692046 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.692024 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:31:31.692382 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.692364 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:31:31.692470 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.692385 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:31:31.692916 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.692897 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pj4z2\"" Apr 23 13:31:31.693367 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.693351 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"] Apr 23 13:31:31.693446 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.693374 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-68c655548f-7bjs7"] Apr 23 13:31:31.693446 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.693384 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tzxth"] Apr 23 13:31:31.693519 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.693479 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.695729 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.695711 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:31:31.695835 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.695735 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:31:31.695835 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.695711 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gtr2j\"" Apr 23 13:31:31.704547 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.704526 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-785s4"] Apr 23 13:31:31.767595 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767558 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-bound-sa-token\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.767595 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767598 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-tmp-dir\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.767822 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767655 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-trusted-ca\") pod 
\"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.767822 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767727 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dsbfj\" (UniqueName: \"kubernetes.io/projected/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-kube-api-access-dsbfj\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.767822 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767780 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-installation-pull-secrets\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.767822 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767807 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:31:31.768005 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767873 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26d41729-e489-4e6c-997e-ca85d3402bba-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:31:31.768005 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767907 2581 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-image-registry-private-configuration\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.768005 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767952 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.768005 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767974 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.768005 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.767990 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-certificates\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.768275 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.768042 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xqzpw\" (UniqueName: 
\"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-kube-api-access-xqzpw\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.768275 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.768083 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nj7wf\" (UniqueName: \"kubernetes.io/projected/0d8b2cf1-4023-4221-a610-b0935e9dd17c-kube-api-access-nj7wf\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:31:31.768275 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.768131 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-ca-trust-extracted\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.768275 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.768189 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:31:31.768275 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.768221 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-config-volume\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " 
pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.868901 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.868867 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:31:31.868901 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.868907 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-config-volume\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.869113 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.868925 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-bound-sa-token\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.869113 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869015 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:31:31.869113 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869057 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-tmp-dir\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.869113 ip-10-0-133-33 kubenswrapper[2581]: E0423 
13:31:31.869085 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. No retries permitted until 2026-04-23 13:31:32.369069137 +0000 UTC m=+34.640998452 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869118 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-trusted-ca\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869177 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dsbfj\" (UniqueName: \"kubernetes.io/projected/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-kube-api-access-dsbfj\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869202 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869222 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-installation-pull-secrets\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869241 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869287 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26d41729-e489-4e6c-997e-ca85d3402bba-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869313 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-image-registry-private-configuration\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869339 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869343 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869357 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-tmp-dir\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869370 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.869380 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869385 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.869367942 +0000 UTC m=+66.141297257 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869466 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869483 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-certificates\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869505 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:32.369492727 +0000 UTC m=+34.641422040 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869534 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xqzpw\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-kube-api-access-xqzpw\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869545 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-config-volume\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4"
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869567 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nj7wf\" (UniqueName: \"kubernetes.io/projected/0d8b2cf1-4023-4221-a610-b0935e9dd17c-kube-api-access-nj7wf\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth"
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869605 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-ca-trust-extracted\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869828 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869891 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:32.369877464 +0000 UTC m=+34.641806780 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.869916 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-ca-trust-extracted\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869924 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869960 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found
Apr 23 13:31:31.869993 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:31.869999 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:32.369984827 +0000 UTC m=+34.641914166 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found
Apr 23 13:31:31.870521 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.870089 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-certificates\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.870521 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.870360 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-trusted-ca\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.870521 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.870482 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/26d41729-e489-4e6c-997e-ca85d3402bba-nginx-conf\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"
Apr 23 13:31:31.874129 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.874103 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-image-registry-private-configuration\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.874252 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.874136 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-installation-pull-secrets\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.878125 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.878097 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xqzpw\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-kube-api-access-xqzpw\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.878441 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.878407 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-bound-sa-token\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:31.878552 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.878491 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dsbfj\" (UniqueName: \"kubernetes.io/projected/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-kube-api-access-dsbfj\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4"
Apr 23 13:31:31.878552 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:31.878501 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nj7wf\" (UniqueName: \"kubernetes.io/projected/0d8b2cf1-4023-4221-a610-b0935e9dd17c-kube-api-access-nj7wf\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth"
Apr 23 13:31:32.071604 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:32.071550 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:32.071797 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.071738 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:32.071797 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.071764 2581 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:32.071797 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.071777 2581 projected.go:194] Error preparing data for projected volume kube-api-access-ktx7q for pod openshift-network-diagnostics/network-check-target-749xq: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:32.071947 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.071851 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q podName:9d83eb43-b297-4115-9710-b7d78d042392 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:04.071831387 +0000 UTC m=+66.343760720 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-ktx7q" (UniqueName: "kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q") pod "network-check-target-749xq" (UID: "9d83eb43-b297-4115-9710-b7d78d042392") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:32.374194 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:32.374138 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth"
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:32.374237 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4"
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:32.374268 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374287 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:32.374313 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374365 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:33.374345389 +0000 UTC m=+35.646274724 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374365 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374405 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:33.374394982 +0000 UTC m=+35.646324298 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374416 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374416 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374437 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374468 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. No retries permitted until 2026-04-23 13:31:33.374452582 +0000 UTC m=+35.646381936 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found
Apr 23 13:31:32.374733 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:32.374490 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:33.374474122 +0000 UTC m=+35.646403462 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found
Apr 23 13:31:33.183549 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.183507 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq"
Apr 23 13:31:33.183715 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.183563 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp"
Apr 23 13:31:33.183715 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.183527 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:33.186700 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.186674 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:31:33.186833 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.186674 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:31:33.187590 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.187562 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:31:33.187708 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.187566 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\""
Apr 23 13:31:33.187764 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.187732 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lbjnn\""
Apr 23 13:31:33.187764 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.187737 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5zcnp\""
Apr 23 13:31:33.382816 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.382777 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth"
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.382826 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4"
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.382847 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:33.382883 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.382912 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.382918 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.382963 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:35.382948227 +0000 UTC m=+37.654877539 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.382976 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:35.382969991 +0000 UTC m=+37.654899304 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.382978 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.383005 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.383019 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.383022 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. No retries permitted until 2026-04-23 13:31:35.383007285 +0000 UTC m=+37.654936598 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found
Apr 23 13:31:33.383198 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:33.383074 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:35.38306625 +0000 UTC m=+37.654995562 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found
Apr 23 13:31:34.353782 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:34.353577 2581 generic.go:358] "Generic (PLEG): container finished" podID="74d0665e-9801-46d4-acaf-54aeb0d3ecd2" containerID="c53d633a85a64b3be45e07f3db858630a05ce90115a4c7e2756dd34b3bcaac90" exitCode=0
Apr 23 13:31:34.353782 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:34.353669 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerDied","Data":"c53d633a85a64b3be45e07f3db858630a05ce90115a4c7e2756dd34b3bcaac90"}
Apr 23 13:31:35.357613 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:35.357582 2581 generic.go:358] "Generic (PLEG): container finished" podID="74d0665e-9801-46d4-acaf-54aeb0d3ecd2" containerID="2fb75654c8c132e996a0d4b3e7521c223ffcdd9c9c6ef071efdb7e7383b14c32" exitCode=0
Apr 23 13:31:35.357958 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:35.357642 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerDied","Data":"2fb75654c8c132e996a0d4b3e7521c223ffcdd9c9c6ef071efdb7e7383b14c32"}
Apr 23 13:31:35.397621 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:35.397594 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"
Apr 23 13:31:35.397762 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:35.397700 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth"
Apr 23 13:31:35.397762 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397734 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:31:35.397762 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:35.397754 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4"
Apr 23 13:31:35.397914 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397793 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.397776637 +0000 UTC m=+41.669705949 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found
Apr 23 13:31:35.397914 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397811 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:35.397914 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:35.397818 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:35.397914 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397852 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:35.397914 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397861 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.39784755 +0000 UTC m=+41.669776862 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found
Apr 23 13:31:35.397914 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397895 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.397879675 +0000 UTC m=+41.669808988 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:35.398201 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397925 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:31:35.398201 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397937 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found
Apr 23 13:31:35.398201 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:35.397969 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:39.397958783 +0000 UTC m=+41.669888095 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found
Apr 23 13:31:36.362319 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:36.362286 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-cskjl" event={"ID":"74d0665e-9801-46d4-acaf-54aeb0d3ecd2","Type":"ContainerStarted","Data":"3130b4a2e59c3648f4d8b3442013a831d8fd4ac53b203645bc8d1655c696d5e7"}
Apr 23 13:31:36.385662 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:36.385617 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-cskjl" podStartSLOduration=5.731915204 podStartE2EDuration="38.385600699s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:31:01.051391688 +0000 UTC m=+3.323321018" lastFinishedPulling="2026-04-23 13:31:33.7050772 +0000 UTC m=+35.977006513" observedRunningTime="2026-04-23 13:31:36.384323828 +0000 UTC m=+38.656253163" watchObservedRunningTime="2026-04-23 13:31:36.385600699 +0000 UTC m=+38.657530034"
Apr 23 13:31:38.421829 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:38.421791 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:38.425391 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:38.425369 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/b3934add-0113-4201-a17a-eaa6e5cbec42-original-pull-secret\") pod \"global-pull-secret-syncer-272n2\" (UID: \"b3934add-0113-4201-a17a-eaa6e5cbec42\") " pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:38.602215 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:38.602179 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-272n2"
Apr 23 13:31:38.739289 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:38.739256 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-272n2"]
Apr 23 13:31:38.742322 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:31:38.742297 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3934add_0113_4201_a17a_eaa6e5cbec42.slice/crio-f12cf0d1a797aa5d69e4a81b6818fdd4dcb4506f03368e6d051707a420b017bf WatchSource:0}: Error finding container f12cf0d1a797aa5d69e4a81b6818fdd4dcb4506f03368e6d051707a420b017bf: Status 404 returned error can't find the container with id f12cf0d1a797aa5d69e4a81b6818fdd4dcb4506f03368e6d051707a420b017bf
Apr 23 13:31:39.369005 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:39.368969 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-272n2" event={"ID":"b3934add-0113-4201-a17a-eaa6e5cbec42","Type":"ContainerStarted","Data":"f12cf0d1a797aa5d69e4a81b6818fdd4dcb4506f03368e6d051707a420b017bf"}
Apr 23 13:31:39.430317 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:39.430279 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth"
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:39.430359 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4"
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:39.430389 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430422 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:39.430431 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430492 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:47.430472217 +0000 UTC m=+49.702401543 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430532 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430583 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. No retries permitted until 2026-04-23 13:31:47.430566733 +0000 UTC m=+49.702496050 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430644 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430676 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:47.430666112 +0000 UTC m=+49.702595428 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430727 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:31:39.430735 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430734 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found
Apr 23 13:31:39.431130 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:39.430755 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:31:47.430748173 +0000 UTC m=+49.702677485 (durationBeforeRetry 8s).
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found Apr 23 13:31:44.379225 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:44.379190 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-272n2" event={"ID":"b3934add-0113-4201-a17a-eaa6e5cbec42","Type":"ContainerStarted","Data":"90b1e14fda4932483d8303a1100f2391c2dc988a76bf1a13ca70f734d5a5df26"} Apr 23 13:31:44.394104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:44.394063 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-272n2" podStartSLOduration=33.67853748 podStartE2EDuration="38.39405046s" podCreationTimestamp="2026-04-23 13:31:06 +0000 UTC" firstStartedPulling="2026-04-23 13:31:38.744110335 +0000 UTC m=+41.016039651" lastFinishedPulling="2026-04-23 13:31:43.459623319 +0000 UTC m=+45.731552631" observedRunningTime="2026-04-23 13:31:44.393685235 +0000 UTC m=+46.665614568" watchObservedRunningTime="2026-04-23 13:31:44.39405046 +0000 UTC m=+46.665979785" Apr 23 13:31:47.493634 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:47.493605 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:31:47.493634 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:47.493640 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " 
pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:47.493667 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493764 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493804 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493818 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493812 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.493798333 +0000 UTC m=+65.765727645 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:47.493760 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493767 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493868 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.493851897 +0000 UTC m=+65.765781209 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493869 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493886 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.493876628 +0000 UTC m=+65.765805939 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found Apr 23 13:31:47.494029 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:31:47.493908 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:03.493898661 +0000 UTC m=+65.765827973 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found Apr 23 13:31:57.348061 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:31:57.348035 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-kgrc5" Apr 23 13:32:03.516165 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:03.516121 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:03.516180 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:03.516198 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:03.516223 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod 
\"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516256 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516296 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516314 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516320 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:32:35.51630593 +0000 UTC m=+97.788235243 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516361 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516373 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. 
No retries permitted until 2026-04-23 13:32:35.516360203 +0000 UTC m=+97.788289515 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516381 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516385 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:35.516378368 +0000 UTC m=+97.788307680 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found Apr 23 13:32:03.516654 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.516445 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:32:35.516428423 +0000 UTC m=+97.788357739 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found Apr 23 13:32:03.919805 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:03.919771 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:32:03.922392 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:03.922371 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 23 13:32:03.930602 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.930584 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:32:03.930678 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:03.930638 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:07.930621886 +0000 UTC m=+130.202551198 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : secret "metrics-daemon-secret" not found Apr 23 13:32:04.121580 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:04.121548 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:32:04.125137 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:04.125117 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 23 13:32:04.135095 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:04.135071 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 23 13:32:04.145863 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:04.145839 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ktx7q\" (UniqueName: \"kubernetes.io/projected/9d83eb43-b297-4115-9710-b7d78d042392-kube-api-access-ktx7q\") pod \"network-check-target-749xq\" (UID: \"9d83eb43-b297-4115-9710-b7d78d042392\") " pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:32:04.398409 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:04.398382 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5zcnp\"" Apr 23 13:32:04.406630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:04.406603 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:32:04.523261 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:04.523232 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-749xq"] Apr 23 13:32:04.526411 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:32:04.526373 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9d83eb43_b297_4115_9710_b7d78d042392.slice/crio-a70c8026f7583d7c1bd472758c8b5d958a2d0c9a2932195937588a0fc5fa925d WatchSource:0}: Error finding container a70c8026f7583d7c1bd472758c8b5d958a2d0c9a2932195937588a0fc5fa925d: Status 404 returned error can't find the container with id a70c8026f7583d7c1bd472758c8b5d958a2d0c9a2932195937588a0fc5fa925d Apr 23 13:32:05.421143 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:05.421102 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-749xq" event={"ID":"9d83eb43-b297-4115-9710-b7d78d042392","Type":"ContainerStarted","Data":"a70c8026f7583d7c1bd472758c8b5d958a2d0c9a2932195937588a0fc5fa925d"} Apr 23 13:32:07.426169 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:07.426128 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-749xq" event={"ID":"9d83eb43-b297-4115-9710-b7d78d042392","Type":"ContainerStarted","Data":"414e2cf0c030b4c0ac807a7b1e1ba2715a0cc15bed06ce2ef253db61f66f06bd"} Apr 23 13:32:07.426549 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:07.426294 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:32:07.444415 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:07.444342 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-749xq" 
podStartSLOduration=66.782627456 podStartE2EDuration="1m9.444330119s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:32:04.528299822 +0000 UTC m=+66.800229137" lastFinishedPulling="2026-04-23 13:32:07.190002473 +0000 UTC m=+69.461931800" observedRunningTime="2026-04-23 13:32:07.44280358 +0000 UTC m=+69.714732914" watchObservedRunningTime="2026-04-23 13:32:07.444330119 +0000 UTC m=+69.716259453" Apr 23 13:32:35.543722 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:35.543686 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:35.543743 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:35.543768 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:35.543787 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: 
\"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.543831 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.543890 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.543904 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.543904 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.543908 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.543893 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. No retries permitted until 2026-04-23 13:33:39.54387622 +0000 UTC m=+161.815805532 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.543972 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:33:39.543955812 +0000 UTC m=+161.815885138 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.544010 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:33:39.54399917 +0000 UTC m=+161.815928485 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found Apr 23 13:32:35.544227 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:32:35.544030 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:33:39.544022803 +0000 UTC m=+161.815952115 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found Apr 23 13:32:38.430302 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:32:38.430269 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-749xq" Apr 23 13:33:07.976892 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:07.976858 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:33:07.977382 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:07.976981 2581 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:33:07.977382 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:07.977035 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs podName:c3fdc691-5051-4c4c-8360-ed987a28f315 nodeName:}" failed. No retries permitted until 2026-04-23 13:35:09.977021379 +0000 UTC m=+252.248950691 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs") pod "network-metrics-daemon-fpksp" (UID: "c3fdc691-5051-4c4c-8360-ed987a28f315") : secret "metrics-daemon-secret" not found Apr 23 13:33:28.734037 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.734003 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gzhr9"] Apr 23 13:33:28.740007 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.739969 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gzhr9" Apr 23 13:33:28.742143 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.742092 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"service-ca-bundle\"" Apr 23 13:33:28.742345 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.742331 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"operator-dockercfg-xpd6v\"" Apr 23 13:33:28.742425 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.742346 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:33:28.743092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.743074 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:33:28.743220 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.743096 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"openshift-insights-serving-cert\"" Apr 23 13:33:28.747295 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.747132 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"trusted-ca-bundle\"" Apr 23 13:33:28.747970 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:33:28.747952 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gzhr9"]
Apr 23 13:33:28.813464 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.813430 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-tmp\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.813464 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.813464 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-2fhhl\" (UniqueName: \"kubernetes.io/projected/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-kube-api-access-2fhhl\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.813692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.813530 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-snapshots\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.813692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.813554 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-serving-cert\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.813692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.813591 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.813692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.813613 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-service-ca-bundle\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.914308 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.914257 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-snapshots\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.914308 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.914305 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-serving-cert\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.914552 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.914377 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.914552 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.914406 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-service-ca-bundle\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.914552 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.914474 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-tmp\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.914552 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.914547 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-2fhhl\" (UniqueName: \"kubernetes.io/projected/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-kube-api-access-2fhhl\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.914940 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.914921 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-tmp\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.915025 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.914986 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"snapshots\" (UniqueName: \"kubernetes.io/empty-dir/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-snapshots\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.915670 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.915650 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-service-ca-bundle\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.916119 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.916103 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-trusted-ca-bundle\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.917049 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.917029 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-serving-cert\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:28.922886 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:28.922866 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-2fhhl\" (UniqueName: \"kubernetes.io/projected/d18e4b4d-4c24-4614-b7e1-e4d9ff536c14-kube-api-access-2fhhl\") pod \"insights-operator-585dfdc468-gzhr9\" (UID: \"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14\") " pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:29.049570 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:29.049494 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-operator-585dfdc468-gzhr9"
Apr 23 13:33:29.158676 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:29.158648 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-operator-585dfdc468-gzhr9"]
Apr 23 13:33:29.162353 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:33:29.162328 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd18e4b4d_4c24_4614_b7e1_e4d9ff536c14.slice/crio-82130c593f31868b5e41802ab2965ffbfacd9179d3ae3a0cbcd41a15cc715efb WatchSource:0}: Error finding container 82130c593f31868b5e41802ab2965ffbfacd9179d3ae3a0cbcd41a15cc715efb: Status 404 returned error can't find the container with id 82130c593f31868b5e41802ab2965ffbfacd9179d3ae3a0cbcd41a15cc715efb
Apr 23 13:33:29.582286 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:29.582251 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gzhr9" event={"ID":"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14","Type":"ContainerStarted","Data":"82130c593f31868b5e41802ab2965ffbfacd9179d3ae3a0cbcd41a15cc715efb"}
Apr 23 13:33:31.588259 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:31.588218 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gzhr9" event={"ID":"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14","Type":"ContainerStarted","Data":"d29488e5c6e830366dde982ed44c81576ddd03cb7f66acf9ff288b5466170343"}
Apr 23 13:33:31.604652 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:31.604597 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-operator-585dfdc468-gzhr9" podStartSLOduration=2.006908626 podStartE2EDuration="3.604583426s" podCreationTimestamp="2026-04-23 13:33:28 +0000 UTC" firstStartedPulling="2026-04-23 13:33:29.164052091 +0000 UTC m=+151.435981403" lastFinishedPulling="2026-04-23 13:33:30.761726869 +0000 UTC m=+153.033656203" observedRunningTime="2026-04-23 13:33:31.603332349 +0000 UTC m=+153.875261684" watchObservedRunningTime="2026-04-23 13:33:31.604583426 +0000 UTC m=+153.876512760"
Apr 23 13:33:33.670352 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.670321 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"]
Apr 23 13:33:33.673600 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.673585 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.676281 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.676261 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"serving-cert\""
Apr 23 13:33:33.676281 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.676273 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"config\""
Apr 23 13:33:33.677197 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.677176 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-storage-version-migrator-operator-dockercfg-xc2dj\""
Apr 23 13:33:33.677197 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.677192 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"kube-root-ca.crt\""
Apr 23 13:33:33.677392 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.677213 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:33:33.680570 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.680553 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"]
Apr 23 13:33:33.750545 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.750520 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.750692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.750569 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l59hl\" (UniqueName: \"kubernetes.io/projected/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-kube-api-access-l59hl\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.750692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.750611 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.781696 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.781674 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-57mv2_72a4caa3-d1c9-4761-adeb-e08cb9c63ab4/dns-node-resolver/0.log"
Apr 23 13:33:33.851225 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.851194 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.851376 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.851283 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.851376 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.851326 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l59hl\" (UniqueName: \"kubernetes.io/projected/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-kube-api-access-l59hl\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.851881 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.851857 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-config\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.853442 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.853419 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-serving-cert\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.859932 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.859910 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l59hl\" (UniqueName: \"kubernetes.io/projected/e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc-kube-api-access-l59hl\") pod \"kube-storage-version-migrator-operator-6769c5d45-f7btc\" (UID: \"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc\") " pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:33.982374 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:33.982288 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"
Apr 23 13:33:34.099183 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:34.099135 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc"]
Apr 23 13:33:34.102395 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:33:34.102366 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0a1eb62_d4d4_431f_95e0_db89b3e5d7cc.slice/crio-615b94a0149bb706a72766ccd32b17941723388d0ba2f6cae7bad086f751c579 WatchSource:0}: Error finding container 615b94a0149bb706a72766ccd32b17941723388d0ba2f6cae7bad086f751c579: Status 404 returned error can't find the container with id 615b94a0149bb706a72766ccd32b17941723388d0ba2f6cae7bad086f751c579
Apr 23 13:33:34.597975 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:34.597927 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc" event={"ID":"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc","Type":"ContainerStarted","Data":"615b94a0149bb706a72766ccd32b17941723388d0ba2f6cae7bad086f751c579"}
Apr 23 13:33:34.693591 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:34.693550 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-image-registry/image-registry-68c655548f-7bjs7" podUID="4965cc34-c960-48a4-926f-7fa8eb0f7e5d"
Apr 23 13:33:34.704408 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:34.704372 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" podUID="26d41729-e489-4e6c-997e-ca85d3402bba"
Apr 23 13:33:34.711568 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:34.711530 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tzxth" podUID="0d8b2cf1-4023-4221-a610-b0935e9dd17c"
Apr 23 13:33:34.717711 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:34.717683 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-785s4" podUID="0b4ad106-4e39-4a55-96b5-d5f06ffb38f4"
Apr 23 13:33:34.981534 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:34.981509 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6lrl9_b566b258-1bd9-4623-93bd-ae1931a2bc34/node-ca/0.log"
Apr 23 13:33:35.600119 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.600067 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-785s4"
Apr 23 13:33:35.600119 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.600108 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"
Apr 23 13:33:35.600374 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.600238 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tzxth"
Apr 23 13:33:35.600422 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.600393 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:33:35.694084 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.694056 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"]
Apr 23 13:33:35.698180 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.698142 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.700641 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.700617 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\""
Apr 23 13:33:35.700753 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.700618 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:33:35.700753 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.700625 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\""
Apr 23 13:33:35.701598 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.701572 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-jxznz\""
Apr 23 13:33:35.701755 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.701574 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\""
Apr 23 13:33:35.706354 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.705942 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"]
Apr 23 13:33:35.767368 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.767337 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mftpk\" (UniqueName: \"kubernetes.io/projected/09175df8-e13c-48b2-84da-c1e5469f683f-kube-api-access-mftpk\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.767531 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.767388 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09175df8-e13c-48b2-84da-c1e5469f683f-config\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.767531 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.767509 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09175df8-e13c-48b2-84da-c1e5469f683f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.868421 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.868380 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mftpk\" (UniqueName: \"kubernetes.io/projected/09175df8-e13c-48b2-84da-c1e5469f683f-kube-api-access-mftpk\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.868593 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.868437 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09175df8-e13c-48b2-84da-c1e5469f683f-config\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.868593 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.868487 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09175df8-e13c-48b2-84da-c1e5469f683f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.869146 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.869120 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/09175df8-e13c-48b2-84da-c1e5469f683f-config\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.871203 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.871179 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/09175df8-e13c-48b2-84da-c1e5469f683f-serving-cert\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:35.877526 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:35.877500 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mftpk\" (UniqueName: \"kubernetes.io/projected/09175df8-e13c-48b2-84da-c1e5469f683f-kube-api-access-mftpk\") pod \"service-ca-operator-d6fc45fc5-qhlmj\" (UID: \"09175df8-e13c-48b2-84da-c1e5469f683f\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:36.008969 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.008926 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"
Apr 23 13:33:36.126645 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.126573 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj"]
Apr 23 13:33:36.131028 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:33:36.130995 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod09175df8_e13c_48b2_84da_c1e5469f683f.slice/crio-8c3b8983e85046563613e622f79b9c8ecd54abc13263d2497e43e38932a1bb6c WatchSource:0}: Error finding container 8c3b8983e85046563613e622f79b9c8ecd54abc13263d2497e43e38932a1bb6c: Status 404 returned error can't find the container with id 8c3b8983e85046563613e622f79b9c8ecd54abc13263d2497e43e38932a1bb6c
Apr 23 13:33:36.212279 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:36.212229 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-fpksp" podUID="c3fdc691-5051-4c4c-8360-ed987a28f315"
Apr 23 13:33:36.605568 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.605534 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc" event={"ID":"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc","Type":"ContainerStarted","Data":"a0a02cee6a76cfd60e95b8221bec54113a1345e68c78d904f8b165eaed4d0121"}
Apr 23 13:33:36.610329 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.610304 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj" event={"ID":"09175df8-e13c-48b2-84da-c1e5469f683f","Type":"ContainerStarted","Data":"8c3b8983e85046563613e622f79b9c8ecd54abc13263d2497e43e38932a1bb6c"}
Apr 23 13:33:36.622736 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.622709 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w"]
Apr 23 13:33:36.624992 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.624954 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc" podStartSLOduration=1.6766334569999999 podStartE2EDuration="3.624942443s" podCreationTimestamp="2026-04-23 13:33:33 +0000 UTC" firstStartedPulling="2026-04-23 13:33:34.104632551 +0000 UTC m=+156.376561863" lastFinishedPulling="2026-04-23 13:33:36.052941527 +0000 UTC m=+158.324870849" observedRunningTime="2026-04-23 13:33:36.623917748 +0000 UTC m=+158.895847081" watchObservedRunningTime="2026-04-23 13:33:36.624942443 +0000 UTC m=+158.896871776"
Apr 23 13:33:36.625563 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.625546 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w"
Apr 23 13:33:36.627906 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.627884 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"network-diagnostics-dockercfg-qkhcd\""
Apr 23 13:33:36.636413 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.636388 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w"]
Apr 23 13:33:36.674751 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.674717 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-d6wml\" (UniqueName: \"kubernetes.io/projected/7d135553-0468-4a2d-acf5-076ed7430660-kube-api-access-d6wml\") pod \"network-check-source-8894fc9bd-7bc2w\" (UID: \"7d135553-0468-4a2d-acf5-076ed7430660\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w"
Apr 23 13:33:36.776043 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.775980 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-d6wml\" (UniqueName: \"kubernetes.io/projected/7d135553-0468-4a2d-acf5-076ed7430660-kube-api-access-d6wml\") pod \"network-check-source-8894fc9bd-7bc2w\" (UID: \"7d135553-0468-4a2d-acf5-076ed7430660\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w"
Apr 23 13:33:36.786065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.786037 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-d6wml\" (UniqueName: \"kubernetes.io/projected/7d135553-0468-4a2d-acf5-076ed7430660-kube-api-access-d6wml\") pod \"network-check-source-8894fc9bd-7bc2w\" (UID: \"7d135553-0468-4a2d-acf5-076ed7430660\") " pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w"
Apr 23 13:33:36.939404 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:36.939323 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w"
Apr 23 13:33:37.074270 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.074235 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w"]
Apr 23 13:33:37.079445 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:33:37.079414 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7d135553_0468_4a2d_acf5_076ed7430660.slice/crio-7d1c0bfc353ed81e45a9ff8427a1bc68c3603e49fd28b172181db5b3617ac353 WatchSource:0}: Error finding container 7d1c0bfc353ed81e45a9ff8427a1bc68c3603e49fd28b172181db5b3617ac353: Status 404 returned error can't find the container with id 7d1c0bfc353ed81e45a9ff8427a1bc68c3603e49fd28b172181db5b3617ac353
Apr 23 13:33:37.420270 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.420239 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm"]
Apr 23 13:33:37.423215 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.423189 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm"
Apr 23 13:33:37.425702 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.425675 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-root-ca.crt\""
Apr 23 13:33:37.426503 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.426489 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-kube-storage-version-migrator\"/\"openshift-service-ca.crt\""
Apr 23 13:33:37.426588 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.426512 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-kube-storage-version-migrator\"/\"kube-storage-version-migrator-sa-dockercfg-x4fb8\""
Apr 23 13:33:37.433027 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.433000 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm"]
Apr 23 13:33:37.482434 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.482404 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8svtm\" (UniqueName: \"kubernetes.io/projected/e1215a93-a289-40d1-8bf9-bcdfac128f1a-kube-api-access-8svtm\") pod \"migrator-74bb7799d9-xmbzm\" (UID: \"e1215a93-a289-40d1-8bf9-bcdfac128f1a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm"
Apr 23 13:33:37.583325 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.583293 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8svtm\" (UniqueName: \"kubernetes.io/projected/e1215a93-a289-40d1-8bf9-bcdfac128f1a-kube-api-access-8svtm\") pod \"migrator-74bb7799d9-xmbzm\" (UID: \"e1215a93-a289-40d1-8bf9-bcdfac128f1a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm"
Apr 23 13:33:37.591820 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.591786 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8svtm\" (UniqueName: \"kubernetes.io/projected/e1215a93-a289-40d1-8bf9-bcdfac128f1a-kube-api-access-8svtm\") pod \"migrator-74bb7799d9-xmbzm\" (UID: \"e1215a93-a289-40d1-8bf9-bcdfac128f1a\") " pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm"
Apr 23 13:33:37.614684 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.614649 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w" event={"ID":"7d135553-0468-4a2d-acf5-076ed7430660","Type":"ContainerStarted","Data":"e6928ce5fb782f8ddb8ab7aeb5d9595f86b613d807c45fc85223da419e460304"}
Apr 23 13:33:37.614825 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.614690 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w" event={"ID":"7d135553-0468-4a2d-acf5-076ed7430660","Type":"ContainerStarted","Data":"7d1c0bfc353ed81e45a9ff8427a1bc68c3603e49fd28b172181db5b3617ac353"}
Apr 23 13:33:37.630984 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.630931 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-source-8894fc9bd-7bc2w" podStartSLOduration=1.63089863 podStartE2EDuration="1.63089863s" podCreationTimestamp="2026-04-23 13:33:36 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:33:37.630713976 +0000 UTC m=+159.902643311" watchObservedRunningTime="2026-04-23 13:33:37.63089863 +0000 UTC m=+159.902827971"
Apr 23 13:33:37.735392 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:37.735307 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm"
Apr 23 13:33:38.043501 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:38.043377 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm"]
Apr 23 13:33:38.045723 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:33:38.045698 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode1215a93_a289_40d1_8bf9_bcdfac128f1a.slice/crio-d3a8a408c0090218eb9041e989dd5e735b4caad25c8fcef146d802af630e2c9b WatchSource:0}: Error finding container d3a8a408c0090218eb9041e989dd5e735b4caad25c8fcef146d802af630e2c9b: Status 404 returned error can't find the container with id d3a8a408c0090218eb9041e989dd5e735b4caad25c8fcef146d802af630e2c9b
Apr 23 13:33:38.618253 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:38.618211 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj" event={"ID":"09175df8-e13c-48b2-84da-c1e5469f683f","Type":"ContainerStarted","Data":"89b8168d8db7a8a7bb002b511c1f9dd9bb3fdcc601ee827bfd5e95a86b8bd106"}
Apr 23 13:33:38.619264 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:38.619239 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm" event={"ID":"e1215a93-a289-40d1-8bf9-bcdfac128f1a","Type":"ContainerStarted","Data":"d3a8a408c0090218eb9041e989dd5e735b4caad25c8fcef146d802af630e2c9b"}
Apr 23 13:33:38.634114 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:38.634061 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj" podStartSLOduration=1.7940284640000002 podStartE2EDuration="3.634047438s" podCreationTimestamp="2026-04-23 13:33:35 +0000 UTC" firstStartedPulling="2026-04-23 13:33:36.132957049 +0000 UTC m=+158.404886361" lastFinishedPulling="2026-04-23 13:33:37.972976009 +0000 UTC m=+160.244905335" observedRunningTime="2026-04-23 13:33:38.633396159 +0000 UTC m=+160.905325493" watchObservedRunningTime="2026-04-23 13:33:38.634047438 +0000 UTC m=+160.905976773"
Apr 23 13:33:39.600610 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:39.600521 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") pod \"image-registry-68c655548f-7bjs7\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " pod="openshift-image-registry/image-registry-68c655548f-7bjs7"
Apr 23 13:33:39.600610 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:39.600578 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"
Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:39.600619 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth"
Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:39.600647 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4"
Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423
13:33:39.600677 2581 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:39.600695 2581 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-68c655548f-7bjs7: secret "image-registry-tls" not found Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:39.600743 2581 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:39.600741 2581 secret.go:189] Couldn't get secret openshift-network-console/networking-console-plugin-cert: secret "networking-console-plugin-cert" not found Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:39.600739 2581 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:39.600755 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls podName:4965cc34-c960-48a4-926f-7fa8eb0f7e5d nodeName:}" failed. No retries permitted until 2026-04-23 13:35:41.600733595 +0000 UTC m=+283.872662926 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls") pod "image-registry-68c655548f-7bjs7" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d") : secret "image-registry-tls" not found Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:39.600827 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls podName:0b4ad106-4e39-4a55-96b5-d5f06ffb38f4 nodeName:}" failed. 
No retries permitted until 2026-04-23 13:35:41.600813238 +0000 UTC m=+283.872742549 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls") pod "dns-default-785s4" (UID: "0b4ad106-4e39-4a55-96b5-d5f06ffb38f4") : secret "dns-default-metrics-tls" not found Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:39.600839 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert podName:26d41729-e489-4e6c-997e-ca85d3402bba nodeName:}" failed. No retries permitted until 2026-04-23 13:35:41.600831969 +0000 UTC m=+283.872761280 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "networking-console-plugin-cert" (UniqueName: "kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert") pod "networking-console-plugin-cb95c66f6-48vsk" (UID: "26d41729-e489-4e6c-997e-ca85d3402bba") : secret "networking-console-plugin-cert" not found Apr 23 13:33:39.600996 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:33:39.600864 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert podName:0d8b2cf1-4023-4221-a610-b0935e9dd17c nodeName:}" failed. No retries permitted until 2026-04-23 13:35:41.600851565 +0000 UTC m=+283.872780880 (durationBeforeRetry 2m2s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert") pod "ingress-canary-tzxth" (UID: "0d8b2cf1-4023-4221-a610-b0935e9dd17c") : secret "canary-serving-cert" not found Apr 23 13:33:39.623328 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:39.623300 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm" event={"ID":"e1215a93-a289-40d1-8bf9-bcdfac128f1a","Type":"ContainerStarted","Data":"99e55187d1eaaa172b061b4d550e818e5002221429d2994bfef88516429ea568"} Apr 23 13:33:39.623328 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:39.623331 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm" event={"ID":"e1215a93-a289-40d1-8bf9-bcdfac128f1a","Type":"ContainerStarted","Data":"d6f7941c5bba9ff453bcd910811e657af7b7b435a86b61af8189d013b8808ab2"} Apr 23 13:33:39.639500 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:39.639457 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-kube-storage-version-migrator/migrator-74bb7799d9-xmbzm" podStartSLOduration=1.43424378 podStartE2EDuration="2.639444865s" podCreationTimestamp="2026-04-23 13:33:37 +0000 UTC" firstStartedPulling="2026-04-23 13:33:38.047824873 +0000 UTC m=+160.319754190" lastFinishedPulling="2026-04-23 13:33:39.253025963 +0000 UTC m=+161.524955275" observedRunningTime="2026-04-23 13:33:39.638278611 +0000 UTC m=+161.910207981" watchObservedRunningTime="2026-04-23 13:33:39.639444865 +0000 UTC m=+161.911374199" Apr 23 13:33:51.183202 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:33:51.183141 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:34:02.484316 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.484285 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-c5px5"] Apr 23 13:34:02.489474 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.489455 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.493072 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.493033 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qjjwc\"" Apr 23 13:34:02.493454 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.493280 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:34:02.493454 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.493294 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:34:02.499716 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.499697 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c5px5"] Apr 23 13:34:02.589896 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.589861 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2606ab5f-c4f7-40b8-8265-6068a4813e3f-crio-socket\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.589896 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.589898 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: 
\"kubernetes.io/empty-dir/2606ab5f-c4f7-40b8-8265-6068a4813e3f-data-volume\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.590132 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.590046 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2606ab5f-c4f7-40b8-8265-6068a4813e3f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.590132 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.590089 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-89vv7\" (UniqueName: \"kubernetes.io/projected/2606ab5f-c4f7-40b8-8265-6068a4813e3f-kube-api-access-89vv7\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.590263 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.590180 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2606ab5f-c4f7-40b8-8265-6068a4813e3f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.690497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.690468 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2606ab5f-c4f7-40b8-8265-6068a4813e3f-crio-socket\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " 
pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.690497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.690498 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2606ab5f-c4f7-40b8-8265-6068a4813e3f-data-volume\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.690673 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.690590 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/2606ab5f-c4f7-40b8-8265-6068a4813e3f-crio-socket\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.690708 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.690673 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2606ab5f-c4f7-40b8-8265-6068a4813e3f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.690708 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.690702 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-89vv7\" (UniqueName: \"kubernetes.io/projected/2606ab5f-c4f7-40b8-8265-6068a4813e3f-kube-api-access-89vv7\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.690787 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.690773 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: 
\"kubernetes.io/configmap/2606ab5f-c4f7-40b8-8265-6068a4813e3f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.690839 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.690780 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/2606ab5f-c4f7-40b8-8265-6068a4813e3f-data-volume\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.691308 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.691289 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/2606ab5f-c4f7-40b8-8265-6068a4813e3f-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.693027 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.693011 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/2606ab5f-c4f7-40b8-8265-6068a4813e3f-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.708201 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.708181 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-89vv7\" (UniqueName: \"kubernetes.io/projected/2606ab5f-c4f7-40b8-8265-6068a4813e3f-kube-api-access-89vv7\") pod \"insights-runtime-extractor-c5px5\" (UID: \"2606ab5f-c4f7-40b8-8265-6068a4813e3f\") " pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.798192 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:34:02.798104 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-c5px5" Apr 23 13:34:02.929523 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:02.929492 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-c5px5"] Apr 23 13:34:02.932727 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:34:02.932698 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2606ab5f_c4f7_40b8_8265_6068a4813e3f.slice/crio-600151d95eb99321a8610c7e007d3a8d9f659e571f64568b1edabe4cbc1992b2 WatchSource:0}: Error finding container 600151d95eb99321a8610c7e007d3a8d9f659e571f64568b1edabe4cbc1992b2: Status 404 returned error can't find the container with id 600151d95eb99321a8610c7e007d3a8d9f659e571f64568b1edabe4cbc1992b2 Apr 23 13:34:03.529019 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.528990 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q"] Apr 23 13:34:03.532451 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.532434 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" Apr 23 13:34:03.535086 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.535059 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 23 13:34:03.535286 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.535105 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-8rpzk\"" Apr 23 13:34:03.539555 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.539531 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q"] Apr 23 13:34:03.597953 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.597931 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b6353867-b800-4aa4-95f5-471fb682e788-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fd27q\" (UID: \"b6353867-b800-4aa4-95f5-471fb682e788\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" Apr 23 13:34:03.680189 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.680135 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c5px5" event={"ID":"2606ab5f-c4f7-40b8-8265-6068a4813e3f","Type":"ContainerStarted","Data":"0b4ac87959dd1d625c4d4ac8ab613c23ff939691679d1382f5b25319f683e5e0"} Apr 23 13:34:03.680280 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.680194 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c5px5" event={"ID":"2606ab5f-c4f7-40b8-8265-6068a4813e3f","Type":"ContainerStarted","Data":"97362ba8a10d0272358946f592e6137b84e20c6be7d77f0f0772fc18428b2dbc"} Apr 23 
13:34:03.680280 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.680205 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c5px5" event={"ID":"2606ab5f-c4f7-40b8-8265-6068a4813e3f","Type":"ContainerStarted","Data":"600151d95eb99321a8610c7e007d3a8d9f659e571f64568b1edabe4cbc1992b2"} Apr 23 13:34:03.698516 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:03.698499 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b6353867-b800-4aa4-95f5-471fb682e788-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fd27q\" (UID: \"b6353867-b800-4aa4-95f5-471fb682e788\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" Apr 23 13:34:03.698615 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:34:03.698590 2581 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-admission-webhook-tls: secret "prometheus-operator-admission-webhook-tls" not found Apr 23 13:34:03.698651 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:34:03.698637 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b6353867-b800-4aa4-95f5-471fb682e788-tls-certificates podName:b6353867-b800-4aa4-95f5-471fb682e788 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:04.198623234 +0000 UTC m=+186.470552546 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certificates" (UniqueName: "kubernetes.io/secret/b6353867-b800-4aa4-95f5-471fb682e788-tls-certificates") pod "prometheus-operator-admission-webhook-57cf98b594-fd27q" (UID: "b6353867-b800-4aa4-95f5-471fb682e788") : secret "prometheus-operator-admission-webhook-tls" not found Apr 23 13:34:04.201932 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:04.201897 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b6353867-b800-4aa4-95f5-471fb682e788-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fd27q\" (UID: \"b6353867-b800-4aa4-95f5-471fb682e788\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" Apr 23 13:34:04.204552 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:04.204526 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/b6353867-b800-4aa4-95f5-471fb682e788-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-fd27q\" (UID: \"b6353867-b800-4aa4-95f5-471fb682e788\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" Apr 23 13:34:04.441470 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:04.441429 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" Apr 23 13:34:04.571305 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:04.571282 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q"] Apr 23 13:34:04.956525 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:34:04.956488 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb6353867_b800_4aa4_95f5_471fb682e788.slice/crio-a8eca381b63acf0c7e4052b8bdc6bc0af2bf360de8d1db755fc7d1e21f1b957d WatchSource:0}: Error finding container a8eca381b63acf0c7e4052b8bdc6bc0af2bf360de8d1db755fc7d1e21f1b957d: Status 404 returned error can't find the container with id a8eca381b63acf0c7e4052b8bdc6bc0af2bf360de8d1db755fc7d1e21f1b957d Apr 23 13:34:05.688365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:05.688323 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-c5px5" event={"ID":"2606ab5f-c4f7-40b8-8265-6068a4813e3f","Type":"ContainerStarted","Data":"1e7317a4c7ef978d521ccf498bfe3168f219d9c3d6dcc8d00776279ec64819c4"} Apr 23 13:34:05.689490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:05.689466 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" event={"ID":"b6353867-b800-4aa4-95f5-471fb682e788","Type":"ContainerStarted","Data":"a8eca381b63acf0c7e4052b8bdc6bc0af2bf360de8d1db755fc7d1e21f1b957d"} Apr 23 13:34:05.711329 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:05.711282 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-c5px5" podStartSLOduration=1.694926039 podStartE2EDuration="3.711268192s" podCreationTimestamp="2026-04-23 13:34:02 +0000 UTC" firstStartedPulling="2026-04-23 13:34:02.984640015 +0000 UTC 
m=+185.256569327" lastFinishedPulling="2026-04-23 13:34:05.000982128 +0000 UTC m=+187.272911480" observedRunningTime="2026-04-23 13:34:05.710808608 +0000 UTC m=+187.982737941" watchObservedRunningTime="2026-04-23 13:34:05.711268192 +0000 UTC m=+187.983197525" Apr 23 13:34:06.692712 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:06.692675 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" event={"ID":"b6353867-b800-4aa4-95f5-471fb682e788","Type":"ContainerStarted","Data":"320da93061f2075a7f33663644233e23ea3cc3ebd8d236952b6398daae019998"} Apr 23 13:34:06.693109 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:06.692990 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" Apr 23 13:34:06.697607 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:06.697586 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" Apr 23 13:34:06.712019 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:06.711974 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-fd27q" podStartSLOduration=2.559016898 podStartE2EDuration="3.711964085s" podCreationTimestamp="2026-04-23 13:34:03 +0000 UTC" firstStartedPulling="2026-04-23 13:34:04.958838169 +0000 UTC m=+187.230767485" lastFinishedPulling="2026-04-23 13:34:06.11178536 +0000 UTC m=+188.383714672" observedRunningTime="2026-04-23 13:34:06.71044423 +0000 UTC m=+188.982373563" watchObservedRunningTime="2026-04-23 13:34:06.711964085 +0000 UTC m=+188.983893441" Apr 23 13:34:11.895878 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.895846 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w"] Apr 23 
13:34:11.899244 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.899224 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:11.901630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.901598 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-kube-rbac-proxy-config\"" Apr 23 13:34:11.901748 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.901697 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 23 13:34:11.901748 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.901698 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 23 13:34:11.902629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.902611 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-dockercfg-wv7f4\"" Apr 23 13:34:11.902718 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.902641 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"openshift-state-metrics-tls\"" Apr 23 13:34:11.902718 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.902655 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 13:34:11.905529 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.905510 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6ffj6"] Apr 23 13:34:11.911381 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.911363 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w"] Apr 23 13:34:11.911694 ip-10-0-133-33 kubenswrapper[2581]: 
I0423 13:34:11.911580 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:11.914856 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.914834 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-kube-rbac-proxy-config\"" Apr 23 13:34:11.914939 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.914865 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-custom-resource-state-configmap\"" Apr 23 13:34:11.914939 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.914915 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-tls\"" Apr 23 13:34:11.915054 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.914919 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-state-metrics-dockercfg-ps7dr\"" Apr 23 13:34:11.921348 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.921330 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6ffj6"] Apr 23 13:34:11.934242 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.934218 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-bndgx"] Apr 23 13:34:11.937309 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.937292 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:11.939474 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.939452 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 13:34:11.939573 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.939482 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 13:34:11.939573 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.939560 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-d2mkv\"" Apr 23 13:34:11.939691 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:11.939643 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 13:34:12.070408 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070365 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97bff001-f602-4b45-914b-959cae86353d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.070587 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070448 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.070587 ip-10-0-133-33 kubenswrapper[2581]: 
I0423 13:34:12.070472 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-textfile\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.070587 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070493 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.070587 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070527 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.070587 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070551 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8887j\" (UniqueName: \"kubernetes.io/projected/4a3714a4-0aab-49d4-9386-940e1b4abedf-kube-api-access-8887j\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.070587 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070575 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.070781 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070592 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3714a4-0aab-49d4-9386-940e1b4abedf-metrics-client-ca\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.070781 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070611 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-tls\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.070781 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070637 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j8j7v\" (UniqueName: \"kubernetes.io/projected/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-kube-api-access-j8j7v\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.070781 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070658 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-accelerators-collector-config\") pod 
\"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.070781 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070677 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.070781 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070700 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.070781 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070767 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.071009 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070785 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-root\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.071009 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:34:12.070807 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/97bff001-f602-4b45-914b-959cae86353d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.071009 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070821 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zf795\" (UniqueName: \"kubernetes.io/projected/97bff001-f602-4b45-914b-959cae86353d-kube-api-access-zf795\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.071009 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070842 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-sys\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.071009 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.070878 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-wtmp\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.172187 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172082 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: 
\"kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.172187 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172113 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3714a4-0aab-49d4-9386-940e1b4abedf-metrics-client-ca\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.172187 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172133 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-tls\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.172187 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172176 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j8j7v\" (UniqueName: \"kubernetes.io/projected/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-kube-api-access-j8j7v\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172206 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-accelerators-collector-config\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 
13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172234 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172255 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:34:12.172277 2581 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172307 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172341 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-root\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: E0423 
13:34:12.172359 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-tls podName:4a3714a4-0aab-49d4-9386-940e1b4abedf nodeName:}" failed. No retries permitted until 2026-04-23 13:34:12.672337465 +0000 UTC m=+194.944266777 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-tls") pod "node-exporter-bndgx" (UID: "4a3714a4-0aab-49d4-9386-940e1b4abedf") : secret "node-exporter-tls" not found Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172379 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-root\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172416 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"volume-directive-shadow\" (UniqueName: \"kubernetes.io/empty-dir/97bff001-f602-4b45-914b-959cae86353d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172447 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zf795\" (UniqueName: \"kubernetes.io/projected/97bff001-f602-4b45-914b-959cae86353d-kube-api-access-zf795\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.172497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172490 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-sys\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172519 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-wtmp\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172558 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97bff001-f602-4b45-914b-959cae86353d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172607 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172626 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-textfile\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " 
pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172646 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172651 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-sys\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172694 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172720 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8887j\" (UniqueName: \"kubernetes.io/projected/4a3714a4-0aab-49d4-9386-940e1b4abedf-kube-api-access-8887j\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.173065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.172830 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"volume-directive-shadow\" (UniqueName: 
\"kubernetes.io/empty-dir/97bff001-f602-4b45-914b-959cae86353d-volume-directive-shadow\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.173539 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.173278 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-wtmp\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.173539 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.173494 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-custom-resource-state-configmap\" (UniqueName: \"kubernetes.io/configmap/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-custom-resource-state-configmap\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.173539 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.173514 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/97bff001-f602-4b45-914b-959cae86353d-metrics-client-ca\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.173693 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.173545 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-metrics-client-ca\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " 
pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.173693 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:34:12.173626 2581 secret.go:189] Couldn't get secret openshift-monitoring/kube-state-metrics-tls: secret "kube-state-metrics-tls" not found Apr 23 13:34:12.173791 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:34:12.173711 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-tls podName:97bff001-f602-4b45-914b-959cae86353d nodeName:}" failed. No retries permitted until 2026-04-23 13:34:12.673692743 +0000 UTC m=+194.945622060 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-state-metrics-tls" (UniqueName: "kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-tls") pod "kube-state-metrics-69db897b98-6ffj6" (UID: "97bff001-f602-4b45-914b-959cae86353d") : secret "kube-state-metrics-tls" not found Apr 23 13:34:12.173864 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.173844 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-textfile\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.174107 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.174085 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-accelerators-collector-config\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.174525 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.174507 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/4a3714a4-0aab-49d4-9386-940e1b4abedf-metrics-client-ca\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.175178 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.175139 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.175367 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.175348 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-kube-rbac-proxy-config\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.175409 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.175351 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-openshift-state-metrics-tls\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.175707 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.175690 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"openshift-state-metrics-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-openshift-state-metrics-kube-rbac-proxy-config\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: 
\"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.183333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.183301 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j8j7v\" (UniqueName: \"kubernetes.io/projected/5a35ac1c-e120-48ab-9e65-e5eb9465a9ab-kube-api-access-j8j7v\") pod \"openshift-state-metrics-9d44df66c-6nr9w\" (UID: \"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab\") " pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.184437 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.184412 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8887j\" (UniqueName: \"kubernetes.io/projected/4a3714a4-0aab-49d4-9386-940e1b4abedf-kube-api-access-8887j\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.184543 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.184524 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zf795\" (UniqueName: \"kubernetes.io/projected/97bff001-f602-4b45-914b-959cae86353d-kube-api-access-zf795\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.209981 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.209958 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" Apr 23 13:34:12.328919 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.328823 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w"] Apr 23 13:34:12.331511 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:34:12.331483 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5a35ac1c_e120_48ab_9e65_e5eb9465a9ab.slice/crio-9b6996fab4cd6f5c696d4b59bbb653d0916f2995d383aac80bde695afcb9bdcc WatchSource:0}: Error finding container 9b6996fab4cd6f5c696d4b59bbb653d0916f2995d383aac80bde695afcb9bdcc: Status 404 returned error can't find the container with id 9b6996fab4cd6f5c696d4b59bbb653d0916f2995d383aac80bde695afcb9bdcc Apr 23 13:34:12.677973 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.677890 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.678130 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.677999 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-tls\") pod \"node-exporter-bndgx\" (UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.680358 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.680340 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/4a3714a4-0aab-49d4-9386-940e1b4abedf-node-exporter-tls\") pod \"node-exporter-bndgx\" 
(UID: \"4a3714a4-0aab-49d4-9386-940e1b4abedf\") " pod="openshift-monitoring/node-exporter-bndgx" Apr 23 13:34:12.680426 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.680353 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-state-metrics-tls\" (UniqueName: \"kubernetes.io/secret/97bff001-f602-4b45-914b-959cae86353d-kube-state-metrics-tls\") pod \"kube-state-metrics-69db897b98-6ffj6\" (UID: \"97bff001-f602-4b45-914b-959cae86353d\") " pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.708811 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.708778 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" event={"ID":"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab","Type":"ContainerStarted","Data":"581f19034ce8b1861a4229232d227bbb4be7abd7a4d97f4968f1184ca9870fef"} Apr 23 13:34:12.708811 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.708813 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" event={"ID":"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab","Type":"ContainerStarted","Data":"c0f2ae16765d8635b1cb88e91f89aa5927894cd9cfab7fe43a77020c9909f311"} Apr 23 13:34:12.709002 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.708823 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" event={"ID":"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab","Type":"ContainerStarted","Data":"9b6996fab4cd6f5c696d4b59bbb653d0916f2995d383aac80bde695afcb9bdcc"} Apr 23 13:34:12.820959 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.820924 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" Apr 23 13:34:12.847080 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.847048 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-bndgx"
Apr 23 13:34:12.856218 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:34:12.856187 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4a3714a4_0aab_49d4_9386_940e1b4abedf.slice/crio-38c62dd1c125bdddd10d21df020301a626b1d049d75fac03bf4d90956aabec0b WatchSource:0}: Error finding container 38c62dd1c125bdddd10d21df020301a626b1d049d75fac03bf4d90956aabec0b: Status 404 returned error can't find the container with id 38c62dd1c125bdddd10d21df020301a626b1d049d75fac03bf4d90956aabec0b
Apr 23 13:34:12.948352 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:12.948260 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/kube-state-metrics-69db897b98-6ffj6"]
Apr 23 13:34:12.952835 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:34:12.952803 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod97bff001_f602_4b45_914b_959cae86353d.slice/crio-259e3816591d98dc33acee1aa85ec706f9ff31c9a6cc1b8b4e1262eff1af5fb3 WatchSource:0}: Error finding container 259e3816591d98dc33acee1aa85ec706f9ff31c9a6cc1b8b4e1262eff1af5fb3: Status 404 returned error can't find the container with id 259e3816591d98dc33acee1aa85ec706f9ff31c9a6cc1b8b4e1262eff1af5fb3
Apr 23 13:34:13.713739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:13.713705 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" event={"ID":"5a35ac1c-e120-48ab-9e65-e5eb9465a9ab","Type":"ContainerStarted","Data":"8603af1a9977a60fe29dd63efd49b4ba6b4321b68be427df87b893361b2fec13"}
Apr 23 13:34:13.714961 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:13.714931 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" event={"ID":"97bff001-f602-4b45-914b-959cae86353d","Type":"ContainerStarted","Data":"259e3816591d98dc33acee1aa85ec706f9ff31c9a6cc1b8b4e1262eff1af5fb3"}
Apr 23 13:34:13.716002 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:13.715979 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bndgx" event={"ID":"4a3714a4-0aab-49d4-9386-940e1b4abedf","Type":"ContainerStarted","Data":"38c62dd1c125bdddd10d21df020301a626b1d049d75fac03bf4d90956aabec0b"}
Apr 23 13:34:13.732791 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:13.732746 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/openshift-state-metrics-9d44df66c-6nr9w" podStartSLOduration=1.817953355 podStartE2EDuration="2.732729637s" podCreationTimestamp="2026-04-23 13:34:11 +0000 UTC" firstStartedPulling="2026-04-23 13:34:12.435732301 +0000 UTC m=+194.707661615" lastFinishedPulling="2026-04-23 13:34:13.35050857 +0000 UTC m=+195.622437897" observedRunningTime="2026-04-23 13:34:13.731345441 +0000 UTC m=+196.003274790" watchObservedRunningTime="2026-04-23 13:34:13.732729637 +0000 UTC m=+196.004658972"
Apr 23 13:34:14.721014 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:14.720969 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" event={"ID":"97bff001-f602-4b45-914b-959cae86353d","Type":"ContainerStarted","Data":"20b879d0a97767b61c0ab4fd0b1b1202c409ef0256d5b9f00fb16afa67d6788f"}
Apr 23 13:34:14.721014 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:14.721015 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" event={"ID":"97bff001-f602-4b45-914b-959cae86353d","Type":"ContainerStarted","Data":"6d9a4aa758a50109c0627cc67be3de7f331f18fbaf9a929bfa9a3f667e58f171"}
Apr 23 13:34:14.721553 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:14.721026 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" event={"ID":"97bff001-f602-4b45-914b-959cae86353d","Type":"ContainerStarted","Data":"1b25d917fa5df25143d6beadd6cd58123a875413f715a388464af239e949d7e0"}
Apr 23 13:34:14.722460 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:14.722430 2581 generic.go:358] "Generic (PLEG): container finished" podID="4a3714a4-0aab-49d4-9386-940e1b4abedf" containerID="937e643ec049f9eb34cbd63478b5e347bb57fb7a4907d293638593a6b1826604" exitCode=0
Apr 23 13:34:14.722542 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:14.722509 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bndgx" event={"ID":"4a3714a4-0aab-49d4-9386-940e1b4abedf","Type":"ContainerDied","Data":"937e643ec049f9eb34cbd63478b5e347bb57fb7a4907d293638593a6b1826604"}
Apr 23 13:34:14.739755 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:14.739711 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/kube-state-metrics-69db897b98-6ffj6" podStartSLOduration=2.4783943600000002 podStartE2EDuration="3.739694917s" podCreationTimestamp="2026-04-23 13:34:11 +0000 UTC" firstStartedPulling="2026-04-23 13:34:12.955503323 +0000 UTC m=+195.227432638" lastFinishedPulling="2026-04-23 13:34:14.216803868 +0000 UTC m=+196.488733195" observedRunningTime="2026-04-23 13:34:14.737724745 +0000 UTC m=+197.009654080" watchObservedRunningTime="2026-04-23 13:34:14.739694917 +0000 UTC m=+197.011624252"
Apr 23 13:34:15.005882 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.005805 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"]
Apr 23 13:34:15.009624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.009605 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.011931 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.011910 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 23 13:34:15.012048 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.011911 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 23 13:34:15.012048 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.011963 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-6tekmhiujaamg\""
Apr 23 13:34:15.012048 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.012029 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 23 13:34:15.012308 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.012289 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 23 13:34:15.012368 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.012327 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 23 13:34:15.012418 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.012297 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-55zrc\""
Apr 23 13:34:15.019486 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.019467 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"]
Apr 23 13:34:15.103081 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.103025 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-s7qtc\" (UniqueName: \"kubernetes.io/projected/5253e9cf-063e-45ac-a8c8-f8a909c4003a-kube-api-access-s7qtc\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.103292 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.103084 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.103292 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.103202 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.103292 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.103228 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5253e9cf-063e-45ac-a8c8-f8a909c4003a-metrics-client-ca\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.103292 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.103254 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.103422 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.103329 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.103422 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.103399 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-tls\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.103482 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.103425 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-grpc-tls\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.204706 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.204658 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-tls\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.204706 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.204713 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-grpc-tls\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.204914 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.204762 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-s7qtc\" (UniqueName: \"kubernetes.io/projected/5253e9cf-063e-45ac-a8c8-f8a909c4003a-kube-api-access-s7qtc\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.204914 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.204800 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.204987 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.204922 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.204987 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.204959 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5253e9cf-063e-45ac-a8c8-f8a909c4003a-metrics-client-ca\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.205089 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.204993 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.205214 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.205189 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.205745 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.205719 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/5253e9cf-063e-45ac-a8c8-f8a909c4003a-metrics-client-ca\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.207582 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.207558 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-grpc-tls\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.207697 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.207677 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.207862 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.207838 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.207903 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.207892 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.207934 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.207904 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-tls\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.207934 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.207914 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/5253e9cf-063e-45ac-a8c8-f8a909c4003a-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.212264 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.212245 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-s7qtc\" (UniqueName: \"kubernetes.io/projected/5253e9cf-063e-45ac-a8c8-f8a909c4003a-kube-api-access-s7qtc\") pod \"thanos-querier-7b4c896dd7-mpf47\" (UID: \"5253e9cf-063e-45ac-a8c8-f8a909c4003a\") " pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.319353 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.319259 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"
Apr 23 13:34:15.444787 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.444448 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-7b4c896dd7-mpf47"]
Apr 23 13:34:15.447493 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:34:15.447467 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5253e9cf_063e_45ac_a8c8_f8a909c4003a.slice/crio-61ab5eaf0f79073c1bec5502e5ddfcda41c9a68ae66cfc3f4d61ca0d5492e69c WatchSource:0}: Error finding container 61ab5eaf0f79073c1bec5502e5ddfcda41c9a68ae66cfc3f4d61ca0d5492e69c: Status 404 returned error can't find the container with id 61ab5eaf0f79073c1bec5502e5ddfcda41c9a68ae66cfc3f4d61ca0d5492e69c
Apr 23 13:34:15.726579 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.726541 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" event={"ID":"5253e9cf-063e-45ac-a8c8-f8a909c4003a","Type":"ContainerStarted","Data":"61ab5eaf0f79073c1bec5502e5ddfcda41c9a68ae66cfc3f4d61ca0d5492e69c"}
Apr 23 13:34:15.728505 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.728478 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bndgx" event={"ID":"4a3714a4-0aab-49d4-9386-940e1b4abedf","Type":"ContainerStarted","Data":"2b097d7c18aa12ddfee0f3fe4ea3607fb0ab510a33d5341f3e4fffd6b988bf6d"}
Apr 23 13:34:15.728505 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.728507 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-bndgx" event={"ID":"4a3714a4-0aab-49d4-9386-940e1b4abedf","Type":"ContainerStarted","Data":"0b3f097f98630fbab77140cbf4ce2b8311383d1179e60d553b31b2dd421a5251"}
Apr 23 13:34:15.746586 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:15.746542 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-bndgx" podStartSLOduration=3.8754582429999997 podStartE2EDuration="4.746527466s" podCreationTimestamp="2026-04-23 13:34:11 +0000 UTC" firstStartedPulling="2026-04-23 13:34:12.858504437 +0000 UTC m=+195.130433754" lastFinishedPulling="2026-04-23 13:34:13.729573658 +0000 UTC m=+196.001502977" observedRunningTime="2026-04-23 13:34:15.745290312 +0000 UTC m=+198.017219646" watchObservedRunningTime="2026-04-23 13:34:15.746527466 +0000 UTC m=+198.018456799"
Apr 23 13:34:16.685510 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.685474 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9"]
Apr 23 13:34:16.688584 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.688569 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9"
Apr 23 13:34:16.690679 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.690659 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\""
Apr 23 13:34:16.690772 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.690730 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-9v7ch\""
Apr 23 13:34:16.695041 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.695017 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9"]
Apr 23 13:34:16.821983 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.821947 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/791eaf8b-2188-43eb-8223-cc415e1fd93f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9czs9\" (UID: \"791eaf8b-2188-43eb-8223-cc415e1fd93f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9"
Apr 23 13:34:16.923511 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.923470 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/791eaf8b-2188-43eb-8223-cc415e1fd93f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9czs9\" (UID: \"791eaf8b-2188-43eb-8223-cc415e1fd93f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9"
Apr 23 13:34:16.926297 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.926263 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/791eaf8b-2188-43eb-8223-cc415e1fd93f-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-9czs9\" (UID: \"791eaf8b-2188-43eb-8223-cc415e1fd93f\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9"
Apr 23 13:34:17.000049 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:16.999964 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9"
Apr 23 13:34:17.532782 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:17.532744 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9"]
Apr 23 13:34:17.536117 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:34:17.536092 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod791eaf8b_2188_43eb_8223_cc415e1fd93f.slice/crio-7039fab3b1ccea605eddb2fd5b37a0634eedb6861597e7a0a180b8f3afddde3a WatchSource:0}: Error finding container 7039fab3b1ccea605eddb2fd5b37a0634eedb6861597e7a0a180b8f3afddde3a: Status 404 returned error can't find the container with id 7039fab3b1ccea605eddb2fd5b37a0634eedb6861597e7a0a180b8f3afddde3a
Apr 23 13:34:17.735632 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:17.735548 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9" event={"ID":"791eaf8b-2188-43eb-8223-cc415e1fd93f","Type":"ContainerStarted","Data":"7039fab3b1ccea605eddb2fd5b37a0634eedb6861597e7a0a180b8f3afddde3a"}
Apr 23 13:34:17.737370 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:17.737346 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" event={"ID":"5253e9cf-063e-45ac-a8c8-f8a909c4003a","Type":"ContainerStarted","Data":"86316af445e62b7d316c90372ff9b9fbccfbdff46984b6315bef494d75154736"}
Apr 23 13:34:17.737473 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:17.737375 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" event={"ID":"5253e9cf-063e-45ac-a8c8-f8a909c4003a","Type":"ContainerStarted","Data":"3934755a5f7f497b64460f6dc9b84945adc3a36c22e79d5ecbf9422a7ca16c57"}
Apr 23 13:34:17.737473 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:17.737385 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" event={"ID":"5253e9cf-063e-45ac-a8c8-f8a909c4003a","Type":"ContainerStarted","Data":"4f3f4842df818474cf42cedbe4201578de37bd11fafa91040fda12a9b54c6ea7"}
Apr 23 13:34:18.160043 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.159645 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:34:18.163981 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.163959 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.166485 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.166456 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\""
Apr 23 13:34:18.167004 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.166866 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\""
Apr 23 13:34:18.167112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.167013 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\""
Apr 23 13:34:18.167353 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.167325 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\""
Apr 23 13:34:18.167757 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.167737 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\""
Apr 23 13:34:18.167887 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.167870 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\""
Apr 23 13:34:18.168091 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.168071 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3ujmhnjjekjun\""
Apr 23 13:34:18.168236 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.168098 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\""
Apr 23 13:34:18.168236 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.168138 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\""
Apr 23 13:34:18.168236 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.168176 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\""
Apr 23 13:34:18.168419 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.168333 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\""
Apr 23 13:34:18.168419 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.168372 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\""
Apr 23 13:34:18.168419 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.168395 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\""
Apr 23 13:34:18.168601 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.168482 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-xmwp2\""
Apr 23 13:34:18.170626 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.170605 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\""
Apr 23 13:34:18.176485 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.176460 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"]
Apr 23 13:34:18.334539 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334508 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z4bkl\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-kube-api-access-z4bkl\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.334678 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334558 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.334678 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334641 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.334678 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334675 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.334832 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334713 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.334832 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334735 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config-out\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.334832 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334761 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.334832 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334795 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334842 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334867 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334898 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334926 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334968 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.334984 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.335002 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.335036 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-web-config\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.335057 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.335311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.335082 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436058 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436026 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436065 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436082 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436104 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-web-config\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436135 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436170 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436208 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z4bkl\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-kube-api-access-z4bkl\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436228 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436273 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436295 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID:
\"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436344 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436364 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config-out\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436383 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436414 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436456 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436485 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436514 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.436752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.436548 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.437322 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.437230 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.440651 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.440317 2581 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.441036 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.441000 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.441119 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.441055 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-web-config\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.441498 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.441236 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.441932 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.441904 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.442020 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:34:18.441956 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.443096 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.443073 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.444694 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.444665 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.444789 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.444756 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.445456 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.445337 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 
13:34:18.447189 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.447127 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z4bkl\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-kube-api-access-z4bkl\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.447272 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.447230 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.447792 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.447741 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.448265 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.448212 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.448695 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.448627 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" 
Apr 23 13:34:18.449084 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.449023 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config-out\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.449583 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.449561 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.479246 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.478931 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:18.623497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.623460 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:34:18.743344 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.743263 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" event={"ID":"5253e9cf-063e-45ac-a8c8-f8a909c4003a","Type":"ContainerStarted","Data":"0e98db9ae814079bb33be01f60c90e6236eb565611efe16a7177d4ce3952bc83"} Apr 23 13:34:18.743344 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.743300 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" event={"ID":"5253e9cf-063e-45ac-a8c8-f8a909c4003a","Type":"ContainerStarted","Data":"65d2a70426cc52fb1f1cf151ee0b829903a57a8205bfec025bf79529b3ca7311"} Apr 23 13:34:18.743344 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.743317 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" event={"ID":"5253e9cf-063e-45ac-a8c8-f8a909c4003a","Type":"ContainerStarted","Data":"df2fdf3d3e0c4f41298e401fd6b42cf02acea3606bb1c4623513fcf9ac8222f1"} Apr 23 13:34:18.743575 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.743472 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" Apr 23 13:34:18.764743 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:18.764690 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" podStartSLOduration=1.915452063 podStartE2EDuration="4.764673754s" podCreationTimestamp="2026-04-23 13:34:14 +0000 UTC" firstStartedPulling="2026-04-23 13:34:15.449510529 +0000 UTC m=+197.721439841" lastFinishedPulling="2026-04-23 13:34:18.29873222 +0000 UTC m=+200.570661532" observedRunningTime="2026-04-23 13:34:18.764221009 +0000 UTC m=+201.036150343" watchObservedRunningTime="2026-04-23 13:34:18.764673754 +0000 UTC m=+201.036603090" Apr 23 13:34:18.894350 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:34:18.894315 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40f7a79c_aa7b_4aad_999b_1daee200d8ab.slice/crio-0c925f69e87506c279f8ecd3c519ebe6504e604fb529ffaa9b73d4a93a791e20 WatchSource:0}: Error finding container 0c925f69e87506c279f8ecd3c519ebe6504e604fb529ffaa9b73d4a93a791e20: Status 404 returned error can't find the container with id 0c925f69e87506c279f8ecd3c519ebe6504e604fb529ffaa9b73d4a93a791e20 Apr 23 13:34:19.747079 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:19.747040 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerStarted","Data":"0c925f69e87506c279f8ecd3c519ebe6504e604fb529ffaa9b73d4a93a791e20"} Apr 23 13:34:19.748567 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:19.748534 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9" event={"ID":"791eaf8b-2188-43eb-8223-cc415e1fd93f","Type":"ContainerStarted","Data":"151bf84231d7e093717fd0ce07c200da3684ed4c8dd41693e7cce32083c9f02d"} Apr 23 13:34:19.748713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:19.748694 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9" Apr 23 13:34:19.754045 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:19.754024 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9" Apr 23 13:34:19.763197 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:19.763129 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-9czs9" podStartSLOduration=2.361499864 podStartE2EDuration="3.76311817s" podCreationTimestamp="2026-04-23 13:34:16 +0000 UTC" firstStartedPulling="2026-04-23 13:34:17.538289854 +0000 UTC m=+199.810219165" lastFinishedPulling="2026-04-23 13:34:18.939908156 +0000 UTC m=+201.211837471" observedRunningTime="2026-04-23 13:34:19.76272139 +0000 UTC m=+202.034650725" watchObservedRunningTime="2026-04-23 13:34:19.76311817 +0000 UTC m=+202.035047503" Apr 23 13:34:20.752758 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:20.752718 2581 generic.go:358] "Generic (PLEG): container finished" podID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerID="888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27" exitCode=0 Apr 23 13:34:20.753193 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:20.752806 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" 
event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerDied","Data":"888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27"} Apr 23 13:34:23.766045 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:23.766019 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerStarted","Data":"1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e"} Apr 23 13:34:23.766365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:23.766053 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerStarted","Data":"b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04"} Apr 23 13:34:23.766365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:23.766062 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerStarted","Data":"2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91"} Apr 23 13:34:23.766365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:23.766070 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerStarted","Data":"4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349"} Apr 23 13:34:24.332909 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.332880 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68c655548f-7bjs7"] Apr 23 13:34:24.333115 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:34:24.333096 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-68c655548f-7bjs7" 
podUID="4965cc34-c960-48a4-926f-7fa8eb0f7e5d" Apr 23 13:34:24.755381 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.755355 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-7b4c896dd7-mpf47" Apr 23 13:34:24.772544 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.772519 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:34:24.772886 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.772520 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerStarted","Data":"2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2"} Apr 23 13:34:24.772886 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.772626 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerStarted","Data":"2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c"} Apr 23 13:34:24.777018 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.777001 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:34:24.809635 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.809583 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-image-registry-private-configuration\") pod \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " Apr 23 13:34:24.809791 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.809654 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-trusted-ca\") pod \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " Apr 23 13:34:24.809791 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.809700 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-installation-pull-secrets\") pod \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " Apr 23 13:34:24.809791 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.809760 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-ca-trust-extracted\") pod \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " Apr 23 13:34:24.809960 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.809806 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-certificates\") pod \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\" (UID: 
\"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " Apr 23 13:34:24.809960 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.809831 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xqzpw\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-kube-api-access-xqzpw\") pod \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " Apr 23 13:34:24.810167 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.810111 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-bound-sa-token\") pod \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\" (UID: \"4965cc34-c960-48a4-926f-7fa8eb0f7e5d\") " Apr 23 13:34:24.811096 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.810481 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "4965cc34-c960-48a4-926f-7fa8eb0f7e5d" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:34:24.811096 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.810539 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "4965cc34-c960-48a4-926f-7fa8eb0f7e5d" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d"). InnerVolumeSpecName "trusted-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:24.811528 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.811256 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.268489604 podStartE2EDuration="6.811240685s" podCreationTimestamp="2026-04-23 13:34:18 +0000 UTC" firstStartedPulling="2026-04-23 13:34:18.896110348 +0000 UTC m=+201.168039660" lastFinishedPulling="2026-04-23 13:34:23.438861426 +0000 UTC m=+205.710790741" observedRunningTime="2026-04-23 13:34:24.81011167 +0000 UTC m=+207.082041037" watchObservedRunningTime="2026-04-23 13:34:24.811240685 +0000 UTC m=+207.083170020" Apr 23 13:34:24.811528 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.811321 2581 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-trusted-ca\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:34:24.811528 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.811344 2581 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-ca-trust-extracted\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:34:24.813424 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.813388 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "4965cc34-c960-48a4-926f-7fa8eb0f7e5d" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:24.813567 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.813464 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "4965cc34-c960-48a4-926f-7fa8eb0f7e5d" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:34:24.814268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.814233 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "4965cc34-c960-48a4-926f-7fa8eb0f7e5d" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:24.815137 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.815113 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-kube-api-access-xqzpw" (OuterVolumeSpecName: "kube-api-access-xqzpw") pod "4965cc34-c960-48a4-926f-7fa8eb0f7e5d" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d"). InnerVolumeSpecName "kube-api-access-xqzpw". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:24.815942 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.815904 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "4965cc34-c960-48a4-926f-7fa8eb0f7e5d" (UID: "4965cc34-c960-48a4-926f-7fa8eb0f7e5d"). InnerVolumeSpecName "installation-pull-secrets". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:34:24.912447 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.912420 2581 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-installation-pull-secrets\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:34:24.912447 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.912444 2581 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-certificates\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:34:24.912604 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.912454 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xqzpw\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-kube-api-access-xqzpw\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:34:24.912604 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.912464 2581 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-bound-sa-token\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:34:24.912604 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:24.912473 2581 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-image-registry-private-configuration\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:34:25.775056 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:25.775026 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-68c655548f-7bjs7" Apr 23 13:34:25.807790 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:25.807762 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-68c655548f-7bjs7"] Apr 23 13:34:25.810556 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:25.810533 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-68c655548f-7bjs7"] Apr 23 13:34:25.921744 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:25.921700 2581 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/4965cc34-c960-48a4-926f-7fa8eb0f7e5d-registry-tls\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:34:26.190303 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:26.189010 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4965cc34-c960-48a4-926f-7fa8eb0f7e5d" path="/var/lib/kubelet/pods/4965cc34-c960-48a4-926f-7fa8eb0f7e5d/volumes" Apr 23 13:34:28.479618 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:28.479583 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:34:46.839420 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:46.839335 2581 generic.go:358] "Generic (PLEG): container finished" podID="e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc" containerID="a0a02cee6a76cfd60e95b8221bec54113a1345e68c78d904f8b165eaed4d0121" exitCode=0 Apr 23 13:34:46.839420 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:46.839407 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc" event={"ID":"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc","Type":"ContainerDied","Data":"a0a02cee6a76cfd60e95b8221bec54113a1345e68c78d904f8b165eaed4d0121"} Apr 23 13:34:46.839807 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:34:46.839704 2581 scope.go:117] "RemoveContainer" containerID="a0a02cee6a76cfd60e95b8221bec54113a1345e68c78d904f8b165eaed4d0121" Apr 23 13:34:47.844281 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:47.844246 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-kube-storage-version-migrator-operator/kube-storage-version-migrator-operator-6769c5d45-f7btc" event={"ID":"e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc","Type":"ContainerStarted","Data":"fc5543804b2e3756f2f4d50e09bb7d37cddffba8198401333a41cb133e2d68a0"} Apr 23 13:34:58.875492 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:58.875458 2581 generic.go:358] "Generic (PLEG): container finished" podID="09175df8-e13c-48b2-84da-c1e5469f683f" containerID="89b8168d8db7a8a7bb002b511c1f9dd9bb3fdcc601ee827bfd5e95a86b8bd106" exitCode=0 Apr 23 13:34:58.875908 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:58.875521 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj" event={"ID":"09175df8-e13c-48b2-84da-c1e5469f683f","Type":"ContainerDied","Data":"89b8168d8db7a8a7bb002b511c1f9dd9bb3fdcc601ee827bfd5e95a86b8bd106"} Apr 23 13:34:58.875908 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:58.875833 2581 scope.go:117] "RemoveContainer" containerID="89b8168d8db7a8a7bb002b511c1f9dd9bb3fdcc601ee827bfd5e95a86b8bd106" Apr 23 13:34:59.879365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:34:59.879331 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-qhlmj" event={"ID":"09175df8-e13c-48b2-84da-c1e5469f683f","Type":"ContainerStarted","Data":"143b9754d3c140cbc381d64035cf0f8969031925572cd287640ebb43a33b33cb"} Apr 23 13:35:01.887636 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:01.887598 2581 generic.go:358] "Generic (PLEG): container finished" podID="d18e4b4d-4c24-4614-b7e1-e4d9ff536c14" containerID="d29488e5c6e830366dde982ed44c81576ddd03cb7f66acf9ff288b5466170343" exitCode=0 Apr 23 
13:35:01.888021 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:01.887668 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gzhr9" event={"ID":"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14","Type":"ContainerDied","Data":"d29488e5c6e830366dde982ed44c81576ddd03cb7f66acf9ff288b5466170343"} Apr 23 13:35:01.888021 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:01.888006 2581 scope.go:117] "RemoveContainer" containerID="d29488e5c6e830366dde982ed44c81576ddd03cb7f66acf9ff288b5466170343" Apr 23 13:35:02.848348 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:02.848325 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-57mv2_72a4caa3-d1c9-4761-adeb-e08cb9c63ab4/dns-node-resolver/0.log" Apr 23 13:35:02.891480 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:02.891452 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-operator-585dfdc468-gzhr9" event={"ID":"d18e4b4d-4c24-4614-b7e1-e4d9ff536c14","Type":"ContainerStarted","Data":"e10074efac51105971e9b0ff6746cc005bc46bd246e58bd631644842abc4409f"} Apr 23 13:35:10.006670 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:10.006578 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:35:10.008973 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:10.008951 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/c3fdc691-5051-4c4c-8360-ed987a28f315-metrics-certs\") pod \"network-metrics-daemon-fpksp\" (UID: \"c3fdc691-5051-4c4c-8360-ed987a28f315\") " pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:35:10.086120 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:35:10.086093 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lbjnn\"" Apr 23 13:35:10.094137 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:10.094123 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-fpksp" Apr 23 13:35:10.214038 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:10.214006 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-fpksp"] Apr 23 13:35:10.217180 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:35:10.217136 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc3fdc691_5051_4c4c_8360_ed987a28f315.slice/crio-4032a103df7ca0e60665580d59255a0fd1bfa87e06ca2bb9e71f34547177ec28 WatchSource:0}: Error finding container 4032a103df7ca0e60665580d59255a0fd1bfa87e06ca2bb9e71f34547177ec28: Status 404 returned error can't find the container with id 4032a103df7ca0e60665580d59255a0fd1bfa87e06ca2bb9e71f34547177ec28 Apr 23 13:35:10.916347 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:10.916302 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fpksp" event={"ID":"c3fdc691-5051-4c4c-8360-ed987a28f315","Type":"ContainerStarted","Data":"4032a103df7ca0e60665580d59255a0fd1bfa87e06ca2bb9e71f34547177ec28"} Apr 23 13:35:11.921447 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:11.920952 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fpksp" event={"ID":"c3fdc691-5051-4c4c-8360-ed987a28f315","Type":"ContainerStarted","Data":"50bf3310dc119d9e3f7ae18c9a59a9e5b2ebe849feab9da18bfd1e490c80b5e5"} Apr 23 13:35:11.921447 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:11.921385 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-fpksp" 
event={"ID":"c3fdc691-5051-4c4c-8360-ed987a28f315","Type":"ContainerStarted","Data":"0f73556d451ba335c88f0e940381816b986f3071057e230042aadcb2cd96b7b1"} Apr 23 13:35:11.940331 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:11.940286 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-fpksp" podStartSLOduration=252.978123351 podStartE2EDuration="4m13.94026268s" podCreationTimestamp="2026-04-23 13:30:58 +0000 UTC" firstStartedPulling="2026-04-23 13:35:10.219125767 +0000 UTC m=+252.491055078" lastFinishedPulling="2026-04-23 13:35:11.181265094 +0000 UTC m=+253.453194407" observedRunningTime="2026-04-23 13:35:11.938038267 +0000 UTC m=+254.209967598" watchObservedRunningTime="2026-04-23 13:35:11.94026268 +0000 UTC m=+254.212192013" Apr 23 13:35:18.479123 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:18.479084 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:18.498472 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:18.498447 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:18.957347 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:18.957324 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:36.521870 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.521840 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:35:36.522538 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.522487 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="prometheus" containerID="cri-o://4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349" gracePeriod=600 Apr 23 13:35:36.522651 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:35:36.522544 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy-web" containerID="cri-o://1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e" gracePeriod=600 Apr 23 13:35:36.522651 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.522589 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy-thanos" containerID="cri-o://2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2" gracePeriod=600 Apr 23 13:35:36.522651 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.522592 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="config-reloader" containerID="cri-o://2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91" gracePeriod=600 Apr 23 13:35:36.522651 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.522540 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy" containerID="cri-o://2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c" gracePeriod=600 Apr 23 13:35:36.522822 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.522543 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/prometheus-k8s-0" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="thanos-sidecar" containerID="cri-o://b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04" gracePeriod=600 Apr 23 13:35:36.995024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.994991 2581 generic.go:358] "Generic (PLEG): 
container finished" podID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerID="2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2" exitCode=0 Apr 23 13:35:36.995024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995015 2581 generic.go:358] "Generic (PLEG): container finished" podID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerID="2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c" exitCode=0 Apr 23 13:35:36.995024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995024 2581 generic.go:358] "Generic (PLEG): container finished" podID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerID="b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04" exitCode=0 Apr 23 13:35:36.995024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995030 2581 generic.go:358] "Generic (PLEG): container finished" podID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerID="2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91" exitCode=0 Apr 23 13:35:36.995327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995036 2581 generic.go:358] "Generic (PLEG): container finished" podID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerID="4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349" exitCode=0 Apr 23 13:35:36.995327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995066 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerDied","Data":"2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2"} Apr 23 13:35:36.995327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995105 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerDied","Data":"2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c"} Apr 23 13:35:36.995327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995118 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerDied","Data":"b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04"} Apr 23 13:35:36.995327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995130 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerDied","Data":"2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91"} Apr 23 13:35:36.995327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:36.995141 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerDied","Data":"4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349"} Apr 23 13:35:37.764338 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.764309 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:37.849318 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849245 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-serving-certs-ca-bundle\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849318 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849277 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849318 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849311 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-metrics-client-certs\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849551 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849348 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-db\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849551 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849376 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-rulefiles-0\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") 
" Apr 23 13:35:37.849551 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849402 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-tls\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849551 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849424 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849551 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849449 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-metrics-client-ca\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849551 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849489 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-kube-rbac-proxy\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849551 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849527 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-web-config\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849551 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:35:37.849553 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-trusted-ca-bundle\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849581 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-grpc-tls\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849618 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z4bkl\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-kube-api-access-z4bkl\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849642 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849684 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-thanos-prometheus-http-client-file\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849926 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:35:37.849707 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-tls-assets\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849730 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config-out\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849759 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-kubelet-serving-ca-bundle\") pod \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\" (UID: \"40f7a79c-aa7b-4aad-999b-1daee200d8ab\") " Apr 23 13:35:37.849926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849815 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-serving-certs-ca-bundle" (OuterVolumeSpecName: "configmap-serving-certs-ca-bundle") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "configmap-serving-certs-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:37.850342 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.849962 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-metrics-client-ca" (OuterVolumeSpecName: "configmap-metrics-client-ca") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "configmap-metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:37.850342 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.850055 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-serving-certs-ca-bundle\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.850342 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.850073 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-metrics-client-ca\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.852128 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.851239 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-trusted-ca-bundle" (OuterVolumeSpecName: "prometheus-trusted-ca-bundle") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "prometheus-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:37.852128 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.851383 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-db" (OuterVolumeSpecName: "prometheus-k8s-db") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "prometheus-k8s-db". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:35:37.852128 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.851877 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-rulefiles-0" (OuterVolumeSpecName: "prometheus-k8s-rulefiles-0") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "prometheus-k8s-rulefiles-0". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:37.853404 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.853140 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-kubelet-serving-ca-bundle" (OuterVolumeSpecName: "configmap-kubelet-serving-ca-bundle") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "configmap-kubelet-serving-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:37.853404 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.853261 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-kube-api-access-z4bkl" (OuterVolumeSpecName: "kube-api-access-z4bkl") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "kube-api-access-z4bkl". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:37.853567 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.853514 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-prometheus-k8s-kube-rbac-proxy-web") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "secret-prometheus-k8s-kube-rbac-proxy-web". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.853787 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.853693 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config" (OuterVolumeSpecName: "config") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.853787 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.853762 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-tls") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "secret-prometheus-k8s-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.854126 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.854102 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-metrics-client-certs" (OuterVolumeSpecName: "secret-metrics-client-certs") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "secret-metrics-client-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.854555 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.854517 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-thanos-sidecar-tls" (OuterVolumeSpecName: "secret-prometheus-k8s-thanos-sidecar-tls") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "secret-prometheus-k8s-thanos-sidecar-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.854683 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.854659 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-thanos-prometheus-http-client-file" (OuterVolumeSpecName: "thanos-prometheus-http-client-file") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "thanos-prometheus-http-client-file". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.855107 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.855068 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "tls-assets". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:37.855226 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.855117 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-kube-rbac-proxy" (OuterVolumeSpecName: "secret-kube-rbac-proxy") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "secret-kube-rbac-proxy". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.855762 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.855739 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-grpc-tls" (OuterVolumeSpecName: "secret-grpc-tls") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "secret-grpc-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.855932 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.855910 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config-out" (OuterVolumeSpecName: "config-out") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:35:37.864495 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.864474 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-web-config" (OuterVolumeSpecName: "web-config") pod "40f7a79c-aa7b-4aad-999b-1daee200d8ab" (UID: "40f7a79c-aa7b-4aad-999b-1daee200d8ab"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:37.950905 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950876 2581 reconciler_common.go:299] "Volume detached for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-thanos-prometheus-http-client-file\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.950905 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950902 2581 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-tls-assets\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.950905 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950911 2581 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config-out\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950920 2581 reconciler_common.go:299] "Volume detached for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-configmap-kubelet-serving-ca-bundle\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950930 2581 reconciler_common.go:299] "Volume detached for volume \"config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-config\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950938 2581 reconciler_common.go:299] "Volume detached for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-metrics-client-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 
13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950947 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-db\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950956 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-k8s-rulefiles-0\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950964 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-tls\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950973 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-thanos-sidecar-tls\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950983 2581 reconciler_common.go:299] "Volume detached for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-kube-rbac-proxy\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.950991 2581 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-web-config\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:35:37.950999 2581 reconciler_common.go:299] "Volume detached for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40f7a79c-aa7b-4aad-999b-1daee200d8ab-prometheus-trusted-ca-bundle\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.951008 2581 reconciler_common.go:299] "Volume detached for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-grpc-tls\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.951017 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z4bkl\" (UniqueName: \"kubernetes.io/projected/40f7a79c-aa7b-4aad-999b-1daee200d8ab-kube-api-access-z4bkl\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:37.951083 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:37.951025 2581 reconciler_common.go:299] "Volume detached for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/40f7a79c-aa7b-4aad-999b-1daee200d8ab-secret-prometheus-k8s-kube-rbac-proxy-web\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:35:38.000814 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.000786 2581 generic.go:358] "Generic (PLEG): container finished" podID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerID="1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e" exitCode=0 Apr 23 13:35:38.000934 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.000864 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerDied","Data":"1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e"} Apr 23 13:35:38.000934 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.000901 2581 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"40f7a79c-aa7b-4aad-999b-1daee200d8ab","Type":"ContainerDied","Data":"0c925f69e87506c279f8ecd3c519ebe6504e604fb529ffaa9b73d4a93a791e20"} Apr 23 13:35:38.000934 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.000903 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.000934 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.000917 2581 scope.go:117] "RemoveContainer" containerID="2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2" Apr 23 13:35:38.008783 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.008764 2581 scope.go:117] "RemoveContainer" containerID="2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c" Apr 23 13:35:38.015325 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.015309 2581 scope.go:117] "RemoveContainer" containerID="1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e" Apr 23 13:35:38.021654 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.021636 2581 scope.go:117] "RemoveContainer" containerID="b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04" Apr 23 13:35:38.023668 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.023647 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:35:38.027645 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.027625 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:35:38.029675 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.029660 2581 scope.go:117] "RemoveContainer" containerID="2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91" Apr 23 13:35:38.035982 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.035966 2581 scope.go:117] "RemoveContainer" containerID="4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349" Apr 23 13:35:38.044775 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.044749 2581 scope.go:117] "RemoveContainer" containerID="888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27" Apr 23 13:35:38.051613 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.051596 2581 scope.go:117] "RemoveContainer" containerID="2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2" Apr 23 13:35:38.051930 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.051906 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2\": container with ID starting with 2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2 not found: ID does not exist" containerID="2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2" Apr 23 13:35:38.052025 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.051936 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2"} err="failed to get container status \"2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2\": rpc error: code = NotFound desc = could not find container \"2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2\": container with ID starting with 2d1e8b7ce756450ceb1c4fa0e8b902dc90d5493144ffc81c7dbaf7dfde1aeab2 not found: ID does not exist" Apr 23 13:35:38.052025 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.051970 2581 scope.go:117] "RemoveContainer" containerID="2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c" Apr 23 13:35:38.052246 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.052223 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c\": container with ID starting with 
2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c not found: ID does not exist" containerID="2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c" Apr 23 13:35:38.052344 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.052258 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c"} err="failed to get container status \"2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c\": rpc error: code = NotFound desc = could not find container \"2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c\": container with ID starting with 2d9441b3f710101d490740015dc02a3a1bdc49ef1782a77407903333155aa59c not found: ID does not exist" Apr 23 13:35:38.052344 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.052289 2581 scope.go:117] "RemoveContainer" containerID="1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e" Apr 23 13:35:38.052536 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.052520 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e\": container with ID starting with 1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e not found: ID does not exist" containerID="1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e" Apr 23 13:35:38.052598 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.052544 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e"} err="failed to get container status \"1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e\": rpc error: code = NotFound desc = could not find container \"1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e\": container with ID starting with 
1ed16373bf9aa4f1c8da2708f217b2c6f0962104ab25abca780380aef6fe2e5e not found: ID does not exist" Apr 23 13:35:38.052598 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.052564 2581 scope.go:117] "RemoveContainer" containerID="b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04" Apr 23 13:35:38.052695 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.052639 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:35:38.052801 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.052781 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04\": container with ID starting with b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04 not found: ID does not exist" containerID="b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04" Apr 23 13:35:38.052847 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.052818 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04"} err="failed to get container status \"b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04\": rpc error: code = NotFound desc = could not find container \"b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04\": container with ID starting with b1fae3181e480317d8671cd6713e89231722044b27eee822e45b0e728851ad04 not found: ID does not exist" Apr 23 13:35:38.052847 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.052833 2581 scope.go:117] "RemoveContainer" containerID="2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91" Apr 23 13:35:38.053049 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.053032 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91\": container with ID starting with 2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91 not found: ID does not exist" containerID="2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91" Apr 23 13:35:38.053100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053047 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="prometheus" Apr 23 13:35:38.053100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053064 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="prometheus" Apr 23 13:35:38.053100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053084 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy-thanos" Apr 23 13:35:38.053100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053090 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy-thanos" Apr 23 13:35:38.053100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053097 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053103 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053112 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="config-reloader" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053117 2581 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="config-reloader" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053124 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy-web" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053129 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy-web" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053137 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="thanos-sidecar" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053141 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="thanos-sidecar" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053172 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="init-config-reloader" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053179 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="init-config-reloader" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053234 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053054 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91"} err="failed to get container status 
\"2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91\": rpc error: code = NotFound desc = could not find container \"2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91\": container with ID starting with 2480a2d11aec4a52a695f0c18975fb5905af8802973eeff52b48289fff583c91 not found: ID does not exist" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053242 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="thanos-sidecar" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053249 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="config-reloader" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053255 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="prometheus" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053262 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy-thanos" Apr 23 13:35:38.053291 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053269 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" containerName="kube-rbac-proxy-web" Apr 23 13:35:38.053926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053253 2581 scope.go:117] "RemoveContainer" containerID="4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349" Apr 23 13:35:38.053926 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.053548 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349\": container with ID starting with 
4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349 not found: ID does not exist" containerID="4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349" Apr 23 13:35:38.053926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053567 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349"} err="failed to get container status \"4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349\": rpc error: code = NotFound desc = could not find container \"4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349\": container with ID starting with 4c886128c070b771d3f8380afa4295df3ab2c882ab23b5bde532c71acaa96349 not found: ID does not exist" Apr 23 13:35:38.053926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053598 2581 scope.go:117] "RemoveContainer" containerID="888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27" Apr 23 13:35:38.053926 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.053875 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27\": container with ID starting with 888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27 not found: ID does not exist" containerID="888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27" Apr 23 13:35:38.053926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.053894 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27"} err="failed to get container status \"888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27\": rpc error: code = NotFound desc = could not find container \"888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27\": container with ID starting with 
888b806ab8d4134b270d7a92775b9657eb02ef2f14bb8c4ad70d64b5cac1bc27 not found: ID does not exist" Apr 23 13:35:38.058378 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.058363 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.060849 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.060833 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-grpc-tls-3ujmhnjjekjun\"" Apr 23 13:35:38.060991 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.060974 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls-assets-0\"" Apr 23 13:35:38.060991 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.060981 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-web-config\"" Apr 23 13:35:38.061163 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.060974 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 13:35:38.061163 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.060981 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-tls\"" Apr 23 13:35:38.061273 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.061166 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"kube-rbac-proxy\"" Apr 23 13:35:38.061438 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.061417 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-kube-rbac-proxy-web\"" Apr 23 13:35:38.061438 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.061435 2581 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-prometheus-http-client-file\"" Apr 23 13:35:38.061616 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.061477 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-dockercfg-xmwp2\"" Apr 23 13:35:38.061616 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.061510 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-client-certs\"" Apr 23 13:35:38.061616 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.061514 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s\"" Apr 23 13:35:38.061896 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.061880 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-thanos-sidecar-tls\"" Apr 23 13:35:38.062106 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.062094 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"serving-certs-ca-bundle\"" Apr 23 13:35:38.064393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.064371 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-k8s-rulefiles-0\"" Apr 23 13:35:38.073982 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.071102 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:35:38.073982 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.073867 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"prometheus-trusted-ca-bundle\"" Apr 23 13:35:38.152980 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.152949 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153130 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153018 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153130 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153039 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153130 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153070 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153323 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153142 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: 
\"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153323 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153195 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153323 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153230 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-web-config\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153323 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153257 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153323 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153340 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kube-api-access-5nqnb\" (UniqueName: \"kubernetes.io/projected/a9040ede-a7ad-4c03-9f92-a21411de4988-kube-api-access-5nqnb\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153361 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153395 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153417 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153432 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153449 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-config\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153467 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153464 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9040ede-a7ad-4c03-9f92-a21411de4988-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153483 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.153713 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.153507 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9040ede-a7ad-4c03-9f92-a21411de4988-config-out\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.188915 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.188887 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="40f7a79c-aa7b-4aad-999b-1daee200d8ab" path="/var/lib/kubelet/pods/40f7a79c-aa7b-4aad-999b-1daee200d8ab/volumes" Apr 23 13:35:38.254861 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:35:38.254838 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-web-config\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.254985 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.254874 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.254985 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.254923 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255109 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255068 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5nqnb\" (UniqueName: \"kubernetes.io/projected/a9040ede-a7ad-4c03-9f92-a21411de4988-kube-api-access-5nqnb\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255191 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255108 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255191 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255181 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255304 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255223 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255304 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255248 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255304 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255274 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-config\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255304 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255301 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9040ede-a7ad-4c03-9f92-a21411de4988-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " 
pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255331 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255369 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9040ede-a7ad-4c03-9f92-a21411de4988-config-out\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255403 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-db\" (UniqueName: \"kubernetes.io/empty-dir/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-k8s-db\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255416 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255457 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-thanos-sidecar-tls\") pod 
\"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255485 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255803 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255511 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255803 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255543 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.255803 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.255570 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.256093 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.256036 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" 
(UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-kubelet-serving-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.256224 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.256097 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-metrics-client-ca\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.256550 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.256530 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-configmap-serving-certs-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.257965 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.257935 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-trusted-ca-bundle\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.258313 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.258288 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-web-config\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.258861 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.258836 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"thanos-prometheus-http-client-file\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-thanos-prometheus-http-client-file\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.259281 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.259259 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-kube-rbac-proxy-web\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.259753 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.259726 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/a9040ede-a7ad-4c03-9f92-a21411de4988-config-out\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.259753 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.259734 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-client-certs\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-metrics-client-certs\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.259889 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.259808 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-kube-rbac-proxy\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.260004 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.259978 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/a9040ede-a7ad-4c03-9f92-a21411de4988-tls-assets\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.260087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.259987 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-k8s-rulefiles-0\" (UniqueName: \"kubernetes.io/configmap/a9040ede-a7ad-4c03-9f92-a21411de4988-prometheus-k8s-rulefiles-0\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.261004 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.260988 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-thanos-sidecar-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-thanos-sidecar-tls\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.261064 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.261041 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-prometheus-k8s-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-prometheus-k8s-tls\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.261277 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.261261 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-secret-grpc-tls\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.261362 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:35:38.261346 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/secret/a9040ede-a7ad-4c03-9f92-a21411de4988-config\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.263449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.263432 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5nqnb\" (UniqueName: \"kubernetes.io/projected/a9040ede-a7ad-4c03-9f92-a21411de4988-kube-api-access-5nqnb\") pod \"prometheus-k8s-0\" (UID: \"a9040ede-a7ad-4c03-9f92-a21411de4988\") " pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.368505 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.368477 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:38.497680 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:38.497628 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-k8s-0"] Apr 23 13:35:38.499980 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:35:38.499955 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda9040ede_a7ad_4c03_9f92_a21411de4988.slice/crio-bcaa4ccab3bbc73964c31b8e4c7223fd5deacebca3c6c87b4f7ffd647cc862f8 WatchSource:0}: Error finding container bcaa4ccab3bbc73964c31b8e4c7223fd5deacebca3c6c87b4f7ffd647cc862f8: Status 404 returned error can't find the container with id bcaa4ccab3bbc73964c31b8e4c7223fd5deacebca3c6c87b4f7ffd647cc862f8 Apr 23 13:35:38.601337 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.601297 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-tzxth" podUID="0d8b2cf1-4023-4221-a610-b0935e9dd17c" Apr 23 
13:35:38.601430 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.601341 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[networking-console-plugin-cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" podUID="26d41729-e489-4e6c-997e-ca85d3402bba" Apr 23 13:35:38.601430 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:35:38.601341 2581 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-785s4" podUID="0b4ad106-4e39-4a55-96b5-d5f06ffb38f4" Apr 23 13:35:39.005715 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:39.005677 2581 generic.go:358] "Generic (PLEG): container finished" podID="a9040ede-a7ad-4c03-9f92-a21411de4988" containerID="eea1f6cdd4c57bfafa4cb0d4cca7028b5741fe493c899316db70ecffa11482f0" exitCode=0 Apr 23 13:35:39.006146 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:39.005762 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9040ede-a7ad-4c03-9f92-a21411de4988","Type":"ContainerDied","Data":"eea1f6cdd4c57bfafa4cb0d4cca7028b5741fe493c899316db70ecffa11482f0"} Apr 23 13:35:39.006146 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:39.005792 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9040ede-a7ad-4c03-9f92-a21411de4988","Type":"ContainerStarted","Data":"bcaa4ccab3bbc73964c31b8e4c7223fd5deacebca3c6c87b4f7ffd647cc862f8"} Apr 23 13:35:39.006146 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:39.005809 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:35:39.006146 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:39.006059 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-785s4" Apr 23 13:35:39.006387 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:39.006279 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:35:40.013132 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:40.013096 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9040ede-a7ad-4c03-9f92-a21411de4988","Type":"ContainerStarted","Data":"47b5ad3a067024cc6de7e6e38c292a928edf02b090d648eb7889d0ed3306f619"} Apr 23 13:35:40.013132 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:40.013133 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9040ede-a7ad-4c03-9f92-a21411de4988","Type":"ContainerStarted","Data":"3f8274247b7d801cff51a3dee88d2fd08f1ca47baa5cf3b2b51260ec1c9cd774"} Apr 23 13:35:40.013616 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:40.013146 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9040ede-a7ad-4c03-9f92-a21411de4988","Type":"ContainerStarted","Data":"d76c644cc407f23e4c00fccfcd11eb1dab5c6eef2cba9b33f5b17074d9fe8139"} Apr 23 13:35:40.013616 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:40.013180 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9040ede-a7ad-4c03-9f92-a21411de4988","Type":"ContainerStarted","Data":"2f0566c970f6c4ff8b94a33c071f5f09e56811ef294681fedfd8e17871844521"} Apr 23 13:35:40.013616 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:40.013192 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9040ede-a7ad-4c03-9f92-a21411de4988","Type":"ContainerStarted","Data":"ae2999e5e73bcd9fbb478c525685921a683d2765a22a4e2a7ef92a5adb824988"} Apr 23 13:35:40.013616 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:40.013205 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-k8s-0" event={"ID":"a9040ede-a7ad-4c03-9f92-a21411de4988","Type":"ContainerStarted","Data":"226963e9e63f604069a18330ec88354bc19c72965d34661f3d343d02317031eb"} Apr 23 13:35:40.047050 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:40.046993 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-k8s-0" podStartSLOduration=2.046974733 podStartE2EDuration="2.046974733s" podCreationTimestamp="2026-04-23 13:35:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:35:40.043958803 +0000 UTC m=+282.315888136" watchObservedRunningTime="2026-04-23 13:35:40.046974733 +0000 UTC m=+282.318904069" Apr 23 13:35:41.686101 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.686059 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:35:41.686553 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.686140 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:35:41.686553 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.686197 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:35:41.688638 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.688612 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/0b4ad106-4e39-4a55-96b5-d5f06ffb38f4-metrics-tls\") pod \"dns-default-785s4\" (UID: \"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4\") " pod="openshift-dns/dns-default-785s4" Apr 23 13:35:41.688790 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.688768 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"networking-console-plugin-cert\" (UniqueName: \"kubernetes.io/secret/26d41729-e489-4e6c-997e-ca85d3402bba-networking-console-plugin-cert\") pod \"networking-console-plugin-cb95c66f6-48vsk\" (UID: \"26d41729-e489-4e6c-997e-ca85d3402bba\") " pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:35:41.688882 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.688863 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/0d8b2cf1-4023-4221-a610-b0935e9dd17c-cert\") pod \"ingress-canary-tzxth\" (UID: \"0d8b2cf1-4023-4221-a610-b0935e9dd17c\") " pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:35:41.710832 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.710811 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pj4z2\"" Apr 23 13:35:41.710832 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.710820 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gtr2j\"" Apr 23 
13:35:41.710988 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.710820 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-console\"/\"default-dockercfg-xsbtc\"" Apr 23 13:35:41.716750 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.716723 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" Apr 23 13:35:41.716750 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.716733 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-tzxth" Apr 23 13:35:41.716906 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.716756 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-785s4" Apr 23 13:35:41.876824 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:41.876792 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-785s4"] Apr 23 13:35:41.879622 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:35:41.879590 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b4ad106_4e39_4a55_96b5_d5f06ffb38f4.slice/crio-218a3cc307620f62b50391d7828081f18d0fb88b9c0f051962b30bd378ef6461 WatchSource:0}: Error finding container 218a3cc307620f62b50391d7828081f18d0fb88b9c0f051962b30bd378ef6461: Status 404 returned error can't find the container with id 218a3cc307620f62b50391d7828081f18d0fb88b9c0f051962b30bd378ef6461 Apr 23 13:35:42.020943 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:42.020857 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-785s4" event={"ID":"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4","Type":"ContainerStarted","Data":"218a3cc307620f62b50391d7828081f18d0fb88b9c0f051962b30bd378ef6461"} Apr 23 13:35:42.102773 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:42.102749 2581 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-tzxth"] Apr 23 13:35:42.105020 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:35:42.104991 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0d8b2cf1_4023_4221_a610_b0935e9dd17c.slice/crio-601251144b9217f02b32267a5882a10ab625711b1c0181cc8dd891be76150b81 WatchSource:0}: Error finding container 601251144b9217f02b32267a5882a10ab625711b1c0181cc8dd891be76150b81: Status 404 returned error can't find the container with id 601251144b9217f02b32267a5882a10ab625711b1c0181cc8dd891be76150b81 Apr 23 13:35:42.107757 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:42.107734 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-console/networking-console-plugin-cb95c66f6-48vsk"] Apr 23 13:35:42.110041 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:35:42.110020 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod26d41729_e489_4e6c_997e_ca85d3402bba.slice/crio-ad74b69d0753e8e17787debe8e7880d65ea763afee29a80e9dea03663b06c24a WatchSource:0}: Error finding container ad74b69d0753e8e17787debe8e7880d65ea763afee29a80e9dea03663b06c24a: Status 404 returned error can't find the container with id ad74b69d0753e8e17787debe8e7880d65ea763afee29a80e9dea03663b06c24a Apr 23 13:35:43.025429 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:43.025386 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" event={"ID":"26d41729-e489-4e6c-997e-ca85d3402bba","Type":"ContainerStarted","Data":"ad74b69d0753e8e17787debe8e7880d65ea763afee29a80e9dea03663b06c24a"} Apr 23 13:35:43.027025 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:43.026991 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tzxth" 
event={"ID":"0d8b2cf1-4023-4221-a610-b0935e9dd17c","Type":"ContainerStarted","Data":"601251144b9217f02b32267a5882a10ab625711b1c0181cc8dd891be76150b81"} Apr 23 13:35:43.369070 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:43.369034 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-k8s-0" Apr 23 13:35:45.036841 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:45.036792 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-785s4" event={"ID":"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4","Type":"ContainerStarted","Data":"5e6d573b9b19af687c28a1b6adf8b5b6540e2db77ba7e73a69dfecbb55697802"} Apr 23 13:35:45.036841 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:45.036842 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-785s4" event={"ID":"0b4ad106-4e39-4a55-96b5-d5f06ffb38f4","Type":"ContainerStarted","Data":"caf9868b16db1005627fbfbb7fc89d2f68dd81763434ec1e0697dd76afee9bd8"} Apr 23 13:35:45.037348 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:45.036927 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-785s4" Apr 23 13:35:45.038080 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:45.038054 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" event={"ID":"26d41729-e489-4e6c-997e-ca85d3402bba","Type":"ContainerStarted","Data":"ce17569d4462443bf6fc79438fb8643844b68c1f3a27f231a4c34b6198f4cc73"} Apr 23 13:35:45.039583 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:45.039557 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-tzxth" event={"ID":"0d8b2cf1-4023-4221-a610-b0935e9dd17c","Type":"ContainerStarted","Data":"cc829ae7eb2f1b3e6351d69b4284be459b94df0bc51dba2a028ca511c0f92571"} Apr 23 13:35:45.054621 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:45.054580 2581 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-785s4" podStartSLOduration=251.903578959 podStartE2EDuration="4m14.054567475s" podCreationTimestamp="2026-04-23 13:31:31 +0000 UTC" firstStartedPulling="2026-04-23 13:35:41.881824279 +0000 UTC m=+284.153753593" lastFinishedPulling="2026-04-23 13:35:44.032812798 +0000 UTC m=+286.304742109" observedRunningTime="2026-04-23 13:35:45.053768241 +0000 UTC m=+287.325697580" watchObservedRunningTime="2026-04-23 13:35:45.054567475 +0000 UTC m=+287.326496808" Apr 23 13:35:45.068492 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:45.068448 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-tzxth" podStartSLOduration=252.13982994 podStartE2EDuration="4m14.068435153s" podCreationTimestamp="2026-04-23 13:31:31 +0000 UTC" firstStartedPulling="2026-04-23 13:35:42.107329113 +0000 UTC m=+284.379258426" lastFinishedPulling="2026-04-23 13:35:44.035934327 +0000 UTC m=+286.307863639" observedRunningTime="2026-04-23 13:35:45.067202838 +0000 UTC m=+287.339132171" watchObservedRunningTime="2026-04-23 13:35:45.068435153 +0000 UTC m=+287.340364486" Apr 23 13:35:45.081331 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:45.081294 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-console/networking-console-plugin-cb95c66f6-48vsk" podStartSLOduration=263.163046501 podStartE2EDuration="4m25.081281852s" podCreationTimestamp="2026-04-23 13:31:20 +0000 UTC" firstStartedPulling="2026-04-23 13:35:42.111706237 +0000 UTC m=+284.383635549" lastFinishedPulling="2026-04-23 13:35:44.029941587 +0000 UTC m=+286.301870900" observedRunningTime="2026-04-23 13:35:45.080875206 +0000 UTC m=+287.352804542" watchObservedRunningTime="2026-04-23 13:35:45.081281852 +0000 UTC m=+287.353211187" Apr 23 13:35:55.044965 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:55.044937 2581 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="ready" pod="openshift-dns/dns-default-785s4"
Apr 23 13:35:58.146908 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:58.146885 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log"
Apr 23 13:35:58.147326 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:58.147302 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log"
Apr 23 13:35:58.152211 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:35:58.152190 2581 kubelet.go:1628] "Image garbage collection succeeded"
Apr 23 13:36:38.369687 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:36:38.369648 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:38.385797 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:36:38.385773 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:36:39.221098 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:36:39.221071 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/prometheus-k8s-0"
Apr 23 13:39:26.397138 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.397104 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-lbcz5"]
Apr 23 13:39:26.400351 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.400332 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5"
Apr 23 13:39:26.403327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.403303 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-cainjector-dockercfg-tk7zh\""
Apr 23 13:39:26.404241 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.404227 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 23 13:39:26.404323 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.404257 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 23 13:39:26.410797 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.410772 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-lbcz5"]
Apr 23 13:39:26.479286 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.479244 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68163b2c-f768-41c2-a8ec-f7d895cd8af6-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-lbcz5\" (UID: \"68163b2c-f768-41c2-a8ec-f7d895cd8af6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5"
Apr 23 13:39:26.479286 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.479288 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rr4hb\" (UniqueName: \"kubernetes.io/projected/68163b2c-f768-41c2-a8ec-f7d895cd8af6-kube-api-access-rr4hb\") pod \"cert-manager-cainjector-68b757865b-lbcz5\" (UID: \"68163b2c-f768-41c2-a8ec-f7d895cd8af6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5"
Apr 23 13:39:26.580145 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.580109 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68163b2c-f768-41c2-a8ec-f7d895cd8af6-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-lbcz5\" (UID: \"68163b2c-f768-41c2-a8ec-f7d895cd8af6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5"
Apr 23 13:39:26.580349 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.580171 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rr4hb\" (UniqueName: \"kubernetes.io/projected/68163b2c-f768-41c2-a8ec-f7d895cd8af6-kube-api-access-rr4hb\") pod \"cert-manager-cainjector-68b757865b-lbcz5\" (UID: \"68163b2c-f768-41c2-a8ec-f7d895cd8af6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5"
Apr 23 13:39:26.593487 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.593459 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rr4hb\" (UniqueName: \"kubernetes.io/projected/68163b2c-f768-41c2-a8ec-f7d895cd8af6-kube-api-access-rr4hb\") pod \"cert-manager-cainjector-68b757865b-lbcz5\" (UID: \"68163b2c-f768-41c2-a8ec-f7d895cd8af6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5"
Apr 23 13:39:26.593614 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.593517 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/68163b2c-f768-41c2-a8ec-f7d895cd8af6-bound-sa-token\") pod \"cert-manager-cainjector-68b757865b-lbcz5\" (UID: \"68163b2c-f768-41c2-a8ec-f7d895cd8af6\") " pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5"
Apr 23 13:39:26.725060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.724973 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5"
Apr 23 13:39:26.756938 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.756893 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-g84bx"]
Apr 23 13:39:26.761334 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.761308 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:26.763868 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.763849 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-webhook-dockercfg-hfrgl\""
Apr 23 13:39:26.772602 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.772573 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-g84bx"]
Apr 23 13:39:26.782025 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.781994 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a50b2ec-c813-43c4-ba9c-6bde4645a307-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-g84bx\" (UID: \"8a50b2ec-c813-43c4-ba9c-6bde4645a307\") " pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:26.782190 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.782059 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p4fsd\" (UniqueName: \"kubernetes.io/projected/8a50b2ec-c813-43c4-ba9c-6bde4645a307-kube-api-access-p4fsd\") pod \"cert-manager-webhook-587ccfb98-g84bx\" (UID: \"8a50b2ec-c813-43c4-ba9c-6bde4645a307\") " pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:26.857603 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.857518 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-cainjector-68b757865b-lbcz5"]
Apr 23 13:39:26.860577 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:39:26.860551 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod68163b2c_f768_41c2_a8ec_f7d895cd8af6.slice/crio-2c0ddc69bc8d99414b84e13cc502baadf57de3b593b1a7bc3f78a39206a9bee1 WatchSource:0}: Error finding container 2c0ddc69bc8d99414b84e13cc502baadf57de3b593b1a7bc3f78a39206a9bee1: Status 404 returned error can't find the container with id 2c0ddc69bc8d99414b84e13cc502baadf57de3b593b1a7bc3f78a39206a9bee1
Apr 23 13:39:26.862741 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.862725 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:39:26.883270 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.883231 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a50b2ec-c813-43c4-ba9c-6bde4645a307-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-g84bx\" (UID: \"8a50b2ec-c813-43c4-ba9c-6bde4645a307\") " pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:26.883270 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.883267 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p4fsd\" (UniqueName: \"kubernetes.io/projected/8a50b2ec-c813-43c4-ba9c-6bde4645a307-kube-api-access-p4fsd\") pod \"cert-manager-webhook-587ccfb98-g84bx\" (UID: \"8a50b2ec-c813-43c4-ba9c-6bde4645a307\") " pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:26.894750 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.894721 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/8a50b2ec-c813-43c4-ba9c-6bde4645a307-bound-sa-token\") pod \"cert-manager-webhook-587ccfb98-g84bx\" (UID: \"8a50b2ec-c813-43c4-ba9c-6bde4645a307\") " pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:26.894923 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:26.894802 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p4fsd\" (UniqueName: \"kubernetes.io/projected/8a50b2ec-c813-43c4-ba9c-6bde4645a307-kube-api-access-p4fsd\") pod \"cert-manager-webhook-587ccfb98-g84bx\" (UID: \"8a50b2ec-c813-43c4-ba9c-6bde4645a307\") " pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:27.074238 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:27.074129 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:27.200756 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:27.200733 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-webhook-587ccfb98-g84bx"]
Apr 23 13:39:27.203326 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:39:27.203296 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8a50b2ec_c813_43c4_ba9c_6bde4645a307.slice/crio-baea7cceeb8381c5447bbd107e5b555affe0802b91a2326486f40ff178873e15 WatchSource:0}: Error finding container baea7cceeb8381c5447bbd107e5b555affe0802b91a2326486f40ff178873e15: Status 404 returned error can't find the container with id baea7cceeb8381c5447bbd107e5b555affe0802b91a2326486f40ff178873e15
Apr 23 13:39:27.690414 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:27.690376 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx" event={"ID":"8a50b2ec-c813-43c4-ba9c-6bde4645a307","Type":"ContainerStarted","Data":"baea7cceeb8381c5447bbd107e5b555affe0802b91a2326486f40ff178873e15"}
Apr 23 13:39:27.692006 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:27.691966 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5" event={"ID":"68163b2c-f768-41c2-a8ec-f7d895cd8af6","Type":"ContainerStarted","Data":"2c0ddc69bc8d99414b84e13cc502baadf57de3b593b1a7bc3f78a39206a9bee1"}
Apr 23 13:39:30.704302 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:30.704255 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx" event={"ID":"8a50b2ec-c813-43c4-ba9c-6bde4645a307","Type":"ContainerStarted","Data":"6a7f93b5d03373fb2a64912085bf39e892f323ea764f98481a45546845587550"}
Apr 23 13:39:30.704767 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:30.704514 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:30.705581 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:30.705558 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5" event={"ID":"68163b2c-f768-41c2-a8ec-f7d895cd8af6","Type":"ContainerStarted","Data":"6dce91622e7e6b2cb2c7e10e203f180b188edf5bee6efd06aec4d9570772238b"}
Apr 23 13:39:30.731654 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:30.731600 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx" podStartSLOduration=1.648859119 podStartE2EDuration="4.731583131s" podCreationTimestamp="2026-04-23 13:39:26 +0000 UTC" firstStartedPulling="2026-04-23 13:39:27.205202992 +0000 UTC m=+509.477132303" lastFinishedPulling="2026-04-23 13:39:30.287927003 +0000 UTC m=+512.559856315" observedRunningTime="2026-04-23 13:39:30.730898242 +0000 UTC m=+513.002827577" watchObservedRunningTime="2026-04-23 13:39:30.731583131 +0000 UTC m=+513.003512467"
Apr 23 13:39:30.757310 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:30.757244 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-cainjector-68b757865b-lbcz5" podStartSLOduration=1.3428201149999999 podStartE2EDuration="4.757223695s" podCreationTimestamp="2026-04-23 13:39:26 +0000 UTC" firstStartedPulling="2026-04-23 13:39:26.862864464 +0000 UTC m=+509.134793777" lastFinishedPulling="2026-04-23 13:39:30.277268045 +0000 UTC m=+512.549197357" observedRunningTime="2026-04-23 13:39:30.756493078 +0000 UTC m=+513.028422411" watchObservedRunningTime="2026-04-23 13:39:30.757223695 +0000 UTC m=+513.029153030"
Apr 23 13:39:36.710719 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:36.710688 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="cert-manager/cert-manager-webhook-587ccfb98-g84bx"
Apr 23 13:39:39.766646 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.766562 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"]
Apr 23 13:39:39.770190 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.770172 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"
Apr 23 13:39:39.773929 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.773899 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 23 13:39:39.774027 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.773949 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:39:39.774960 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.774941 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-p6n76\""
Apr 23 13:39:39.790951 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.790928 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"]
Apr 23 13:39:39.898267 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.898233 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wswsm\" (UniqueName: \"kubernetes.io/projected/493aaa1c-539a-4784-9d88-fb684ffb4971-kube-api-access-wswsm\") pod \"openshift-lws-operator-bfc7f696d-5w2b6\" (UID: \"493aaa1c-539a-4784-9d88-fb684ffb4971\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"
Apr 23 13:39:39.898267 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.898274 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/493aaa1c-539a-4784-9d88-fb684ffb4971-tmp\") pod \"openshift-lws-operator-bfc7f696d-5w2b6\" (UID: \"493aaa1c-539a-4784-9d88-fb684ffb4971\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"
Apr 23 13:39:39.999553 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.999509 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wswsm\" (UniqueName: \"kubernetes.io/projected/493aaa1c-539a-4784-9d88-fb684ffb4971-kube-api-access-wswsm\") pod \"openshift-lws-operator-bfc7f696d-5w2b6\" (UID: \"493aaa1c-539a-4784-9d88-fb684ffb4971\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"
Apr 23 13:39:39.999553 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.999554 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/493aaa1c-539a-4784-9d88-fb684ffb4971-tmp\") pod \"openshift-lws-operator-bfc7f696d-5w2b6\" (UID: \"493aaa1c-539a-4784-9d88-fb684ffb4971\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"
Apr 23 13:39:39.999930 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:39.999913 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/493aaa1c-539a-4784-9d88-fb684ffb4971-tmp\") pod \"openshift-lws-operator-bfc7f696d-5w2b6\" (UID: \"493aaa1c-539a-4784-9d88-fb684ffb4971\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"
Apr 23 13:39:40.009147 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:40.009114 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wswsm\" (UniqueName: \"kubernetes.io/projected/493aaa1c-539a-4784-9d88-fb684ffb4971-kube-api-access-wswsm\") pod \"openshift-lws-operator-bfc7f696d-5w2b6\" (UID: \"493aaa1c-539a-4784-9d88-fb684ffb4971\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"
Apr 23 13:39:40.079719 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:40.079630 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"
Apr 23 13:39:40.229499 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:40.229473 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6"]
Apr 23 13:39:40.231628 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:39:40.231599 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod493aaa1c_539a_4784_9d88_fb684ffb4971.slice/crio-93071b263d7548b95287925fe72d2ec86e6e7778ce561c924c68a70780946db4 WatchSource:0}: Error finding container 93071b263d7548b95287925fe72d2ec86e6e7778ce561c924c68a70780946db4: Status 404 returned error can't find the container with id 93071b263d7548b95287925fe72d2ec86e6e7778ce561c924c68a70780946db4
Apr 23 13:39:40.737957 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:40.737916 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6" event={"ID":"493aaa1c-539a-4784-9d88-fb684ffb4971","Type":"ContainerStarted","Data":"93071b263d7548b95287925fe72d2ec86e6e7778ce561c924c68a70780946db4"}
Apr 23 13:39:42.745828 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:42.745790 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6" event={"ID":"493aaa1c-539a-4784-9d88-fb684ffb4971","Type":"ContainerStarted","Data":"62599636354a3859c9eb1b97db096c9338c2ae6f53f61befb4e7a7a2d4df85f3"}
Apr 23 13:39:42.775266 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:42.775201 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-5w2b6" podStartSLOduration=1.411304612 podStartE2EDuration="3.775185218s" podCreationTimestamp="2026-04-23 13:39:39 +0000 UTC" firstStartedPulling="2026-04-23 13:39:40.233005292 +0000 UTC m=+522.504934603" lastFinishedPulling="2026-04-23 13:39:42.596885889 +0000 UTC m=+524.868815209" observedRunningTime="2026-04-23 13:39:42.771758205 +0000 UTC m=+525.043687538" watchObservedRunningTime="2026-04-23 13:39:42.775185218 +0000 UTC m=+525.047114552"
Apr 23 13:39:50.472477 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.472437 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"]
Apr 23 13:39:50.475958 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.475936 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.481820 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.480906 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bzhcc\""
Apr 23 13:39:50.481820 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.481142 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 23 13:39:50.481820 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.481371 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 23 13:39:50.481820 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.481624 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 23 13:39:50.492064 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.492041 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"]
Apr 23 13:39:50.592393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.592354 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nbnnr\" (UniqueName: \"kubernetes.io/projected/111f2caf-19d2-44c6-ba31-60a71923b291-kube-api-access-nbnnr\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.592576 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.592405 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/111f2caf-19d2-44c6-ba31-60a71923b291-cert\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.592576 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.592457 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/111f2caf-19d2-44c6-ba31-60a71923b291-manager-config\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.592576 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.592476 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/111f2caf-19d2-44c6-ba31-60a71923b291-metrics-cert\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.693488 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.693446 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nbnnr\" (UniqueName: \"kubernetes.io/projected/111f2caf-19d2-44c6-ba31-60a71923b291-kube-api-access-nbnnr\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.693650 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.693504 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/111f2caf-19d2-44c6-ba31-60a71923b291-cert\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.693697 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.693676 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/111f2caf-19d2-44c6-ba31-60a71923b291-manager-config\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.693733 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.693715 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/111f2caf-19d2-44c6-ba31-60a71923b291-metrics-cert\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.694419 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.694396 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/111f2caf-19d2-44c6-ba31-60a71923b291-manager-config\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.696334 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.696310 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/111f2caf-19d2-44c6-ba31-60a71923b291-metrics-cert\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.696423 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.696310 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/111f2caf-19d2-44c6-ba31-60a71923b291-cert\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.709277 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.709255 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nbnnr\" (UniqueName: \"kubernetes.io/projected/111f2caf-19d2-44c6-ba31-60a71923b291-kube-api-access-nbnnr\") pod \"lws-controller-manager-868f457486-lrj4h\" (UID: \"111f2caf-19d2-44c6-ba31-60a71923b291\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.789147 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.789010 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:50.952393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:50.952368 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"]
Apr 23 13:39:50.954911 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:39:50.954875 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod111f2caf_19d2_44c6_ba31_60a71923b291.slice/crio-5846aa3a94856a7b841c856abb0e0c2570fb76679d6fa52916f3f95991cf4418 WatchSource:0}: Error finding container 5846aa3a94856a7b841c856abb0e0c2570fb76679d6fa52916f3f95991cf4418: Status 404 returned error can't find the container with id 5846aa3a94856a7b841c856abb0e0c2570fb76679d6fa52916f3f95991cf4418
Apr 23 13:39:51.774823 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:51.774785 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h" event={"ID":"111f2caf-19d2-44c6-ba31-60a71923b291","Type":"ContainerStarted","Data":"5846aa3a94856a7b841c856abb0e0c2570fb76679d6fa52916f3f95991cf4418"}
Apr 23 13:39:52.779341 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:52.779299 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h" event={"ID":"111f2caf-19d2-44c6-ba31-60a71923b291","Type":"ContainerStarted","Data":"11a9ea9f28952dd33b733ea46eea70b73efaf096f45c7a731dac5bc5728151a6"}
Apr 23 13:39:52.779341 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:52.779350 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:39:52.800484 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:39:52.800436 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h" podStartSLOduration=1.577624715 podStartE2EDuration="2.800416237s" podCreationTimestamp="2026-04-23 13:39:50 +0000 UTC" firstStartedPulling="2026-04-23 13:39:50.956906525 +0000 UTC m=+533.228835837" lastFinishedPulling="2026-04-23 13:39:52.179698047 +0000 UTC m=+534.451627359" observedRunningTime="2026-04-23 13:39:52.799271612 +0000 UTC m=+535.071200939" watchObservedRunningTime="2026-04-23 13:39:52.800416237 +0000 UTC m=+535.072345571"
Apr 23 13:40:03.787278 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:03.787245 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-868f457486-lrj4h"
Apr 23 13:40:51.483613 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.483574 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"]
Apr 23 13:40:51.486832 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.486814 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"
Apr 23 13:40:51.489327 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.489303 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\""
Apr 23 13:40:51.490208 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.490190 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\""
Apr 23 13:40:51.490297 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.490190 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"limitador-operator-controller-manager-dockercfg-5f447\""
Apr 23 13:40:51.498305 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.498280 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"]
Apr 23 13:40:51.617197 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.617132 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zjkvc\" (UniqueName: \"kubernetes.io/projected/45563efb-424c-493a-b947-946985c787f6-kube-api-access-zjkvc\") pod \"limitador-operator-controller-manager-c7fb4c8d5-tkjgm\" (UID: \"45563efb-424c-493a-b947-946985c787f6\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"
Apr 23 13:40:51.717970 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.717934 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zjkvc\" (UniqueName: \"kubernetes.io/projected/45563efb-424c-493a-b947-946985c787f6-kube-api-access-zjkvc\") pod \"limitador-operator-controller-manager-c7fb4c8d5-tkjgm\" (UID: \"45563efb-424c-493a-b947-946985c787f6\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"
Apr 23 13:40:51.734535 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.734465 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zjkvc\" (UniqueName: \"kubernetes.io/projected/45563efb-424c-493a-b947-946985c787f6-kube-api-access-zjkvc\") pod \"limitador-operator-controller-manager-c7fb4c8d5-tkjgm\" (UID: \"45563efb-424c-493a-b947-946985c787f6\") " pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"
Apr 23 13:40:51.798125 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.798084 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"
Apr 23 13:40:51.927053 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:51.927019 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"]
Apr 23 13:40:51.931650 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:40:51.931612 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod45563efb_424c_493a_b947_946985c787f6.slice/crio-974a7c7007c6a3cc50a1f1bb02ca314f750b5d4b5249c2dc154c4831371be60d WatchSource:0}: Error finding container 974a7c7007c6a3cc50a1f1bb02ca314f750b5d4b5249c2dc154c4831371be60d: Status 404 returned error can't find the container with id 974a7c7007c6a3cc50a1f1bb02ca314f750b5d4b5249c2dc154c4831371be60d
Apr 23 13:40:52.010031 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:52.009949 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm" event={"ID":"45563efb-424c-493a-b947-946985c787f6","Type":"ContainerStarted","Data":"974a7c7007c6a3cc50a1f1bb02ca314f750b5d4b5249c2dc154c4831371be60d"}
Apr 23 13:40:55.022729 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:55.022689 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm" event={"ID":"45563efb-424c-493a-b947-946985c787f6","Type":"ContainerStarted","Data":"0905b67f12ad86183bbee4574349610f7db34feb92c7c13fd53bba4d8de7ff57"}
Apr 23 13:40:55.023116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:55.022836 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"
Apr 23 13:40:55.042115 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:55.042068 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm" podStartSLOduration=1.535335684 podStartE2EDuration="4.042052428s" podCreationTimestamp="2026-04-23 13:40:51 +0000 UTC" firstStartedPulling="2026-04-23 13:40:51.933656816 +0000 UTC m=+594.205586129" lastFinishedPulling="2026-04-23 13:40:54.440373559 +0000 UTC m=+596.712302873" observedRunningTime="2026-04-23 13:40:55.040803156 +0000 UTC m=+597.312732489" watchObservedRunningTime="2026-04-23 13:40:55.042052428 +0000 UTC m=+597.313981761"
Apr 23 13:40:58.172448 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:58.172411 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log"
Apr 23 13:40:58.173511 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:40:58.173481 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log"
Apr 23 13:41:06.028381 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:06.028338 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-operator-controller-manager-c7fb4c8d5-tkjgm"
Apr 23 13:41:38.581849 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.581814 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fq92g"]
Apr 23 13:41:38.585389 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.585366 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g"
Apr 23 13:41:38.587856 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.587830 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-g7kfd\""
Apr 23 13:41:38.587856 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.587828 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"limitador-limits-config-limitador\""
Apr 23 13:41:38.592993 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.592834 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fq92g"]
Apr 23 13:41:38.689313 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.689279 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fq92g"]
Apr 23 13:41:38.741342 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.741309 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-j7h7f\" (UniqueName: \"kubernetes.io/projected/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-kube-api-access-j7h7f\") pod \"limitador-limitador-64c8f475fb-fq92g\" (UID: \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g"
Apr 23 13:41:38.741513 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.741399 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-config-file\") pod \"limitador-limitador-64c8f475fb-fq92g\" (UID: \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g"
Apr 23 13:41:38.842607 ip-10-0-133-33 kubenswrapper[2581]: I0423
13:41:38.842510 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-j7h7f\" (UniqueName: \"kubernetes.io/projected/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-kube-api-access-j7h7f\") pod \"limitador-limitador-64c8f475fb-fq92g\" (UID: \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:38.842607 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.842598 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-config-file\") pod \"limitador-limitador-64c8f475fb-fq92g\" (UID: \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:38.843258 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.843236 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-config-file\") pod \"limitador-limitador-64c8f475fb-fq92g\" (UID: \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:38.851294 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.851268 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-j7h7f\" (UniqueName: \"kubernetes.io/projected/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-kube-api-access-j7h7f\") pod \"limitador-limitador-64c8f475fb-fq92g\" (UID: \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\") " pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:38.896879 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:38.896850 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:39.024416 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:39.024378 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fq92g"] Apr 23 13:41:39.027544 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:41:39.027513 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod959af2a6_c0ac_4d8b_90ad_e7907efda7fb.slice/crio-6ae92ac68d90321b37da4ed81f2c43ebd8661e7737067e7329c50a5db9a88909 WatchSource:0}: Error finding container 6ae92ac68d90321b37da4ed81f2c43ebd8661e7737067e7329c50a5db9a88909: Status 404 returned error can't find the container with id 6ae92ac68d90321b37da4ed81f2c43ebd8661e7737067e7329c50a5db9a88909 Apr 23 13:41:39.177667 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:39.177636 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" event={"ID":"959af2a6-c0ac-4d8b-90ad-e7907efda7fb","Type":"ContainerStarted","Data":"6ae92ac68d90321b37da4ed81f2c43ebd8661e7737067e7329c50a5db9a88909"} Apr 23 13:41:43.194443 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:43.194400 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" event={"ID":"959af2a6-c0ac-4d8b-90ad-e7907efda7fb","Type":"ContainerStarted","Data":"1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650"} Apr 23 13:41:43.194903 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:43.194477 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:43.213989 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:43.213927 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" podStartSLOduration=1.270853309 
podStartE2EDuration="5.213910968s" podCreationTimestamp="2026-04-23 13:41:38 +0000 UTC" firstStartedPulling="2026-04-23 13:41:39.029444393 +0000 UTC m=+641.301373707" lastFinishedPulling="2026-04-23 13:41:42.972502053 +0000 UTC m=+645.244431366" observedRunningTime="2026-04-23 13:41:43.212618719 +0000 UTC m=+645.484548055" watchObservedRunningTime="2026-04-23 13:41:43.213910968 +0000 UTC m=+645.485840302" Apr 23 13:41:54.199277 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:54.199244 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:55.422707 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:55.422671 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fq92g"] Apr 23 13:41:55.423097 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:55.422888 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" podUID="959af2a6-c0ac-4d8b-90ad-e7907efda7fb" containerName="limitador" containerID="cri-o://1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650" gracePeriod=30 Apr 23 13:41:55.963005 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:55.962983 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:56.097330 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.097230 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-j7h7f\" (UniqueName: \"kubernetes.io/projected/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-kube-api-access-j7h7f\") pod \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\" (UID: \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\") " Apr 23 13:41:56.097330 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.097307 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-config-file\") pod \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\" (UID: \"959af2a6-c0ac-4d8b-90ad-e7907efda7fb\") " Apr 23 13:41:56.097665 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.097629 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-config-file" (OuterVolumeSpecName: "config-file") pod "959af2a6-c0ac-4d8b-90ad-e7907efda7fb" (UID: "959af2a6-c0ac-4d8b-90ad-e7907efda7fb"). InnerVolumeSpecName "config-file". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:41:56.099629 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.099596 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-kube-api-access-j7h7f" (OuterVolumeSpecName: "kube-api-access-j7h7f") pod "959af2a6-c0ac-4d8b-90ad-e7907efda7fb" (UID: "959af2a6-c0ac-4d8b-90ad-e7907efda7fb"). InnerVolumeSpecName "kube-api-access-j7h7f". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:41:56.198166 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.198127 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-j7h7f\" (UniqueName: \"kubernetes.io/projected/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-kube-api-access-j7h7f\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:41:56.198274 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.198173 2581 reconciler_common.go:299] "Volume detached for volume \"config-file\" (UniqueName: \"kubernetes.io/configmap/959af2a6-c0ac-4d8b-90ad-e7907efda7fb-config-file\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:41:56.240011 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.239973 2581 generic.go:358] "Generic (PLEG): container finished" podID="959af2a6-c0ac-4d8b-90ad-e7907efda7fb" containerID="1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650" exitCode=0 Apr 23 13:41:56.240195 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.240021 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" event={"ID":"959af2a6-c0ac-4d8b-90ad-e7907efda7fb","Type":"ContainerDied","Data":"1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650"} Apr 23 13:41:56.240195 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.240041 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" Apr 23 13:41:56.240195 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.240064 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/limitador-limitador-64c8f475fb-fq92g" event={"ID":"959af2a6-c0ac-4d8b-90ad-e7907efda7fb","Type":"ContainerDied","Data":"6ae92ac68d90321b37da4ed81f2c43ebd8661e7737067e7329c50a5db9a88909"} Apr 23 13:41:56.240195 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.240084 2581 scope.go:117] "RemoveContainer" containerID="1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650" Apr 23 13:41:56.248459 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.248439 2581 scope.go:117] "RemoveContainer" containerID="1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650" Apr 23 13:41:56.248741 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:41:56.248718 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650\": container with ID starting with 1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650 not found: ID does not exist" containerID="1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650" Apr 23 13:41:56.248802 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.248753 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650"} err="failed to get container status \"1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650\": rpc error: code = NotFound desc = could not find container \"1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650\": container with ID starting with 1c97d232732a4565d3849438d5c0b33c6cb210881a851f3e469905b6ea232650 not found: ID does not exist" Apr 23 13:41:56.256024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.255999 
2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fq92g"] Apr 23 13:41:56.259336 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:56.259314 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/limitador-limitador-64c8f475fb-fq92g"] Apr 23 13:41:58.187848 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:41:58.187810 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="959af2a6-c0ac-4d8b-90ad-e7907efda7fb" path="/var/lib/kubelet/pods/959af2a6-c0ac-4d8b-90ad-e7907efda7fb/volumes" Apr 23 13:42:24.096847 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.096810 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-pv8t4"] Apr 23 13:42:24.097407 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.097239 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="959af2a6-c0ac-4d8b-90ad-e7907efda7fb" containerName="limitador" Apr 23 13:42:24.097407 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.097252 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="959af2a6-c0ac-4d8b-90ad-e7907efda7fb" containerName="limitador" Apr 23 13:42:24.097407 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.097334 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="959af2a6-c0ac-4d8b-90ad-e7907efda7fb" containerName="limitador" Apr 23 13:42:24.112203 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.112170 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8"] Apr 23 13:42:24.112383 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.112288 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:24.115209 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.115182 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-webhook-server-cert\"" Apr 23 13:42:24.115357 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.115216 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:42:24.115357 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.115243 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:42:24.115357 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.115286 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"kserve-controller-manager-dockercfg-kd5b6\"" Apr 23 13:42:24.115497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.115430 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-pv8t4"] Apr 23 13:42:24.115534 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.115525 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:24.117722 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.117704 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-webhook-server-cert\"" Apr 23 13:42:24.117850 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.117791 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"llmisvc-controller-manager-dockercfg-8g2fl\"" Apr 23 13:42:24.124064 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.124041 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8"] Apr 23 13:42:24.248334 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.248301 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12daf06-e20e-4c18-b2f5-a567db50fa8f-cert\") pod \"llmisvc-controller-manager-595fbbc8cc-tdjd8\" (UID: \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\") " pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:24.248492 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.248339 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqlmp\" (UniqueName: \"kubernetes.io/projected/11469ae5-954f-46a8-b8db-8eefbdccfbf6-kube-api-access-sqlmp\") pod \"kserve-controller-manager-6b667fdd66-pv8t4\" (UID: \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\") " pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:24.248492 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.248365 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11469ae5-954f-46a8-b8db-8eefbdccfbf6-cert\") pod \"kserve-controller-manager-6b667fdd66-pv8t4\" (UID: \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\") " 
pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:24.248566 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.248480 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f57c9\" (UniqueName: \"kubernetes.io/projected/b12daf06-e20e-4c18-b2f5-a567db50fa8f-kube-api-access-f57c9\") pod \"llmisvc-controller-manager-595fbbc8cc-tdjd8\" (UID: \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\") " pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:24.349896 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.349815 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f57c9\" (UniqueName: \"kubernetes.io/projected/b12daf06-e20e-4c18-b2f5-a567db50fa8f-kube-api-access-f57c9\") pod \"llmisvc-controller-manager-595fbbc8cc-tdjd8\" (UID: \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\") " pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:24.349896 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.349891 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12daf06-e20e-4c18-b2f5-a567db50fa8f-cert\") pod \"llmisvc-controller-manager-595fbbc8cc-tdjd8\" (UID: \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\") " pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:24.350100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.349912 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqlmp\" (UniqueName: \"kubernetes.io/projected/11469ae5-954f-46a8-b8db-8eefbdccfbf6-kube-api-access-sqlmp\") pod \"kserve-controller-manager-6b667fdd66-pv8t4\" (UID: \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\") " pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:24.350100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.349932 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" 
(UniqueName: \"kubernetes.io/secret/11469ae5-954f-46a8-b8db-8eefbdccfbf6-cert\") pod \"kserve-controller-manager-6b667fdd66-pv8t4\" (UID: \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\") " pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:24.352560 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.352536 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12daf06-e20e-4c18-b2f5-a567db50fa8f-cert\") pod \"llmisvc-controller-manager-595fbbc8cc-tdjd8\" (UID: \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\") " pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:24.352666 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.352619 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11469ae5-954f-46a8-b8db-8eefbdccfbf6-cert\") pod \"kserve-controller-manager-6b667fdd66-pv8t4\" (UID: \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\") " pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:24.358063 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.358033 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqlmp\" (UniqueName: \"kubernetes.io/projected/11469ae5-954f-46a8-b8db-8eefbdccfbf6-kube-api-access-sqlmp\") pod \"kserve-controller-manager-6b667fdd66-pv8t4\" (UID: \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\") " pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:24.358219 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.358198 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f57c9\" (UniqueName: \"kubernetes.io/projected/b12daf06-e20e-4c18-b2f5-a567db50fa8f-kube-api-access-f57c9\") pod \"llmisvc-controller-manager-595fbbc8cc-tdjd8\" (UID: \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\") " pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:24.425525 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:42:24.425487 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:24.432312 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.432280 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:24.567902 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.567876 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-pv8t4"] Apr 23 13:42:24.570430 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:42:24.570402 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod11469ae5_954f_46a8_b8db_8eefbdccfbf6.slice/crio-79c51ef72961179431a5b47f3c864cef0bfc5281eaccd215068d2a9176b6c908 WatchSource:0}: Error finding container 79c51ef72961179431a5b47f3c864cef0bfc5281eaccd215068d2a9176b6c908: Status 404 returned error can't find the container with id 79c51ef72961179431a5b47f3c864cef0bfc5281eaccd215068d2a9176b6c908 Apr 23 13:42:24.596386 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:24.596363 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8"] Apr 23 13:42:24.598671 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:42:24.598640 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podb12daf06_e20e_4c18_b2f5_a567db50fa8f.slice/crio-d5bde48162dcd5d113a1b6a214451577df7b33c76858e759d1e0e8c9feab3c0d WatchSource:0}: Error finding container d5bde48162dcd5d113a1b6a214451577df7b33c76858e759d1e0e8c9feab3c0d: Status 404 returned error can't find the container with id d5bde48162dcd5d113a1b6a214451577df7b33c76858e759d1e0e8c9feab3c0d Apr 23 13:42:25.341214 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:25.341178 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" event={"ID":"11469ae5-954f-46a8-b8db-8eefbdccfbf6","Type":"ContainerStarted","Data":"79c51ef72961179431a5b47f3c864cef0bfc5281eaccd215068d2a9176b6c908"} Apr 23 13:42:25.342223 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:25.342194 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" event={"ID":"b12daf06-e20e-4c18-b2f5-a567db50fa8f","Type":"ContainerStarted","Data":"d5bde48162dcd5d113a1b6a214451577df7b33c76858e759d1e0e8c9feab3c0d"} Apr 23 13:42:28.357478 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:28.357447 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" event={"ID":"11469ae5-954f-46a8-b8db-8eefbdccfbf6","Type":"ContainerStarted","Data":"21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b"} Apr 23 13:42:28.357826 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:28.357593 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:42:28.377794 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:28.377725 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" podStartSLOduration=0.679305975 podStartE2EDuration="4.377704826s" podCreationTimestamp="2026-04-23 13:42:24 +0000 UTC" firstStartedPulling="2026-04-23 13:42:24.571624217 +0000 UTC m=+686.843553529" lastFinishedPulling="2026-04-23 13:42:28.270023066 +0000 UTC m=+690.541952380" observedRunningTime="2026-04-23 13:42:28.37632028 +0000 UTC m=+690.648249628" watchObservedRunningTime="2026-04-23 13:42:28.377704826 +0000 UTC m=+690.649634163" Apr 23 13:42:29.362142 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:29.362106 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" 
event={"ID":"b12daf06-e20e-4c18-b2f5-a567db50fa8f","Type":"ContainerStarted","Data":"956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557"} Apr 23 13:42:29.362661 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:29.362198 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:42:29.385551 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:29.385497 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" podStartSLOduration=1.660833078 podStartE2EDuration="5.385480992s" podCreationTimestamp="2026-04-23 13:42:24 +0000 UTC" firstStartedPulling="2026-04-23 13:42:24.599916044 +0000 UTC m=+686.871845356" lastFinishedPulling="2026-04-23 13:42:28.324563956 +0000 UTC m=+690.596493270" observedRunningTime="2026-04-23 13:42:29.383359919 +0000 UTC m=+691.655289252" watchObservedRunningTime="2026-04-23 13:42:29.385480992 +0000 UTC m=+691.657410325" Apr 23 13:42:59.367698 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:42:59.367610 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:43:00.368517 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:00.368480 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:43:01.980205 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:01.980127 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-pv8t4"] Apr 23 13:43:01.980636 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:01.980421 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" podUID="11469ae5-954f-46a8-b8db-8eefbdccfbf6" containerName="manager" 
containerID="cri-o://21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b" gracePeriod=10 Apr 23 13:43:02.006490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.006460 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-srkvt"] Apr 23 13:43:02.058807 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.058768 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-srkvt"] Apr 23 13:43:02.058955 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.058894 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:02.196864 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.196832 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t5qln\" (UniqueName: \"kubernetes.io/projected/474680a9-b7bc-4845-94e6-6cf0a462a53b-kube-api-access-t5qln\") pod \"kserve-controller-manager-6b667fdd66-srkvt\" (UID: \"474680a9-b7bc-4845-94e6-6cf0a462a53b\") " pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:02.197032 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.196890 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/474680a9-b7bc-4845-94e6-6cf0a462a53b-cert\") pod \"kserve-controller-manager-6b667fdd66-srkvt\" (UID: \"474680a9-b7bc-4845-94e6-6cf0a462a53b\") " pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:02.235408 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.235341 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:43:02.297996 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.297961 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t5qln\" (UniqueName: \"kubernetes.io/projected/474680a9-b7bc-4845-94e6-6cf0a462a53b-kube-api-access-t5qln\") pod \"kserve-controller-manager-6b667fdd66-srkvt\" (UID: \"474680a9-b7bc-4845-94e6-6cf0a462a53b\") " pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:02.298202 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.298009 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/474680a9-b7bc-4845-94e6-6cf0a462a53b-cert\") pod \"kserve-controller-manager-6b667fdd66-srkvt\" (UID: \"474680a9-b7bc-4845-94e6-6cf0a462a53b\") " pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:02.300709 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.300684 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/474680a9-b7bc-4845-94e6-6cf0a462a53b-cert\") pod \"kserve-controller-manager-6b667fdd66-srkvt\" (UID: \"474680a9-b7bc-4845-94e6-6cf0a462a53b\") " pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:02.306479 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.306452 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t5qln\" (UniqueName: \"kubernetes.io/projected/474680a9-b7bc-4845-94e6-6cf0a462a53b-kube-api-access-t5qln\") pod \"kserve-controller-manager-6b667fdd66-srkvt\" (UID: \"474680a9-b7bc-4845-94e6-6cf0a462a53b\") " pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:02.399256 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.399216 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: 
\"kubernetes.io/secret/11469ae5-954f-46a8-b8db-8eefbdccfbf6-cert\") pod \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\" (UID: \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\") " Apr 23 13:43:02.399452 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.399299 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqlmp\" (UniqueName: \"kubernetes.io/projected/11469ae5-954f-46a8-b8db-8eefbdccfbf6-kube-api-access-sqlmp\") pod \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\" (UID: \"11469ae5-954f-46a8-b8db-8eefbdccfbf6\") " Apr 23 13:43:02.401602 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.401574 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/11469ae5-954f-46a8-b8db-8eefbdccfbf6-cert" (OuterVolumeSpecName: "cert") pod "11469ae5-954f-46a8-b8db-8eefbdccfbf6" (UID: "11469ae5-954f-46a8-b8db-8eefbdccfbf6"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:43:02.401707 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.401630 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/11469ae5-954f-46a8-b8db-8eefbdccfbf6-kube-api-access-sqlmp" (OuterVolumeSpecName: "kube-api-access-sqlmp") pod "11469ae5-954f-46a8-b8db-8eefbdccfbf6" (UID: "11469ae5-954f-46a8-b8db-8eefbdccfbf6"). InnerVolumeSpecName "kube-api-access-sqlmp". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:43:02.436457 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.436423 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:02.475288 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.475254 2581 generic.go:358] "Generic (PLEG): container finished" podID="11469ae5-954f-46a8-b8db-8eefbdccfbf6" containerID="21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b" exitCode=0 Apr 23 13:43:02.475489 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.475369 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" event={"ID":"11469ae5-954f-46a8-b8db-8eefbdccfbf6","Type":"ContainerDied","Data":"21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b"} Apr 23 13:43:02.475489 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.475392 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" event={"ID":"11469ae5-954f-46a8-b8db-8eefbdccfbf6","Type":"ContainerDied","Data":"79c51ef72961179431a5b47f3c864cef0bfc5281eaccd215068d2a9176b6c908"} Apr 23 13:43:02.475489 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.475407 2581 scope.go:117] "RemoveContainer" containerID="21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b" Apr 23 13:43:02.475651 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.475515 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/kserve-controller-manager-6b667fdd66-pv8t4" Apr 23 13:43:02.483867 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.483848 2581 scope.go:117] "RemoveContainer" containerID="21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b" Apr 23 13:43:02.484185 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:43:02.484137 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b\": container with ID starting with 21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b not found: ID does not exist" containerID="21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b" Apr 23 13:43:02.484307 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.484192 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b"} err="failed to get container status \"21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b\": rpc error: code = NotFound desc = could not find container \"21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b\": container with ID starting with 21e4992e10c490145c229f8bedcd1a3873b97059c7bd7037e2f6db9a3421159b not found: ID does not exist" Apr 23 13:43:02.500733 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.500653 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sqlmp\" (UniqueName: \"kubernetes.io/projected/11469ae5-954f-46a8-b8db-8eefbdccfbf6-kube-api-access-sqlmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:43:02.500733 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.500686 2581 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/11469ae5-954f-46a8-b8db-8eefbdccfbf6-cert\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 
13:43:02.501583 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.501552 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-pv8t4"] Apr 23 13:43:02.507031 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.507006 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-pv8t4"] Apr 23 13:43:02.566767 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:02.566746 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/kserve-controller-manager-6b667fdd66-srkvt"] Apr 23 13:43:02.569263 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:43:02.569233 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod474680a9_b7bc_4845_94e6_6cf0a462a53b.slice/crio-e3c01a4af27343e15cbde0e12dd4ae738b16de50ac0fe7bd09b81270dd2ca4e0 WatchSource:0}: Error finding container e3c01a4af27343e15cbde0e12dd4ae738b16de50ac0fe7bd09b81270dd2ca4e0: Status 404 returned error can't find the container with id e3c01a4af27343e15cbde0e12dd4ae738b16de50ac0fe7bd09b81270dd2ca4e0 Apr 23 13:43:03.481627 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:03.481593 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" event={"ID":"474680a9-b7bc-4845-94e6-6cf0a462a53b","Type":"ContainerStarted","Data":"7ec527607b95096155ef4a8f90fa39aa20a378420d792220374f781a82c818e4"} Apr 23 13:43:03.481627 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:03.481626 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" event={"ID":"474680a9-b7bc-4845-94e6-6cf0a462a53b","Type":"ContainerStarted","Data":"e3c01a4af27343e15cbde0e12dd4ae738b16de50ac0fe7bd09b81270dd2ca4e0"} Apr 23 13:43:03.482060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:03.481649 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:03.499980 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:03.499917 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" podStartSLOduration=2.170618595 podStartE2EDuration="2.499897647s" podCreationTimestamp="2026-04-23 13:43:01 +0000 UTC" firstStartedPulling="2026-04-23 13:43:02.570686911 +0000 UTC m=+724.842616224" lastFinishedPulling="2026-04-23 13:43:02.899965961 +0000 UTC m=+725.171895276" observedRunningTime="2026-04-23 13:43:03.49887443 +0000 UTC m=+725.770803763" watchObservedRunningTime="2026-04-23 13:43:03.499897647 +0000 UTC m=+725.771826980" Apr 23 13:43:04.187931 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:04.187891 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="11469ae5-954f-46a8-b8db-8eefbdccfbf6" path="/var/lib/kubelet/pods/11469ae5-954f-46a8-b8db-8eefbdccfbf6/volumes" Apr 23 13:43:34.489655 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:34.489619 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/kserve-controller-manager-6b667fdd66-srkvt" Apr 23 13:43:35.374280 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.374249 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/model-serving-api-86f7b4b499-vl94g"] Apr 23 13:43:35.374627 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.374612 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="11469ae5-954f-46a8-b8db-8eefbdccfbf6" containerName="manager" Apr 23 13:43:35.374627 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.374626 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="11469ae5-954f-46a8-b8db-8eefbdccfbf6" containerName="manager" Apr 23 13:43:35.374724 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.374683 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="11469ae5-954f-46a8-b8db-8eefbdccfbf6" 
containerName="manager" Apr 23 13:43:35.378001 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.377979 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:35.381482 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.381462 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-dockercfg-h7tfh\"" Apr 23 13:43:35.381673 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.381649 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"model-serving-api-tls\"" Apr 23 13:43:35.390558 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.390530 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-vl94g"] Apr 23 13:43:35.459861 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.459818 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6kzfs\" (UniqueName: \"kubernetes.io/projected/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-kube-api-access-6kzfs\") pod \"model-serving-api-86f7b4b499-vl94g\" (UID: \"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb\") " pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:35.459861 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.459865 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs\") pod \"model-serving-api-86f7b4b499-vl94g\" (UID: \"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb\") " pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:35.560456 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.560423 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6kzfs\" (UniqueName: \"kubernetes.io/projected/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-kube-api-access-6kzfs\") pod 
\"model-serving-api-86f7b4b499-vl94g\" (UID: \"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb\") " pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:35.560869 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.560461 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs\") pod \"model-serving-api-86f7b4b499-vl94g\" (UID: \"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb\") " pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:35.560869 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:43:35.560595 2581 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 23 13:43:35.560869 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:43:35.560662 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs podName:b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb nodeName:}" failed. No retries permitted until 2026-04-23 13:43:36.060646631 +0000 UTC m=+758.332575944 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs") pod "model-serving-api-86f7b4b499-vl94g" (UID: "b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb") : secret "model-serving-api-tls" not found Apr 23 13:43:35.569619 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:35.569585 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6kzfs\" (UniqueName: \"kubernetes.io/projected/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-kube-api-access-6kzfs\") pod \"model-serving-api-86f7b4b499-vl94g\" (UID: \"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb\") " pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:36.063719 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:36.063677 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs\") pod \"model-serving-api-86f7b4b499-vl94g\" (UID: \"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb\") " pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:36.063909 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:43:36.063832 2581 secret.go:189] Couldn't get secret kserve/model-serving-api-tls: secret "model-serving-api-tls" not found Apr 23 13:43:36.063909 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:43:36.063906 2581 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs podName:b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb nodeName:}" failed. No retries permitted until 2026-04-23 13:43:37.063884671 +0000 UTC m=+759.335813994 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs") pod "model-serving-api-86f7b4b499-vl94g" (UID: "b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb") : secret "model-serving-api-tls" not found Apr 23 13:43:37.072925 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:37.072886 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs\") pod \"model-serving-api-86f7b4b499-vl94g\" (UID: \"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb\") " pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:37.075474 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:37.075450 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb-tls-certs\") pod \"model-serving-api-86f7b4b499-vl94g\" (UID: \"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb\") " pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:37.190912 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:37.190874 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:37.316386 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:37.316354 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/model-serving-api-86f7b4b499-vl94g"] Apr 23 13:43:37.319638 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:43:37.319607 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb3562eb0_0a43_4b3f_8b7b_6d5b4346cadb.slice/crio-230b2e9cbef9cda8802a15f73dd2ae37c7e7ae4877953d0789355780fdf20bda WatchSource:0}: Error finding container 230b2e9cbef9cda8802a15f73dd2ae37c7e7ae4877953d0789355780fdf20bda: Status 404 returned error can't find the container with id 230b2e9cbef9cda8802a15f73dd2ae37c7e7ae4877953d0789355780fdf20bda Apr 23 13:43:37.602752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:37.602662 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-vl94g" event={"ID":"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb","Type":"ContainerStarted","Data":"230b2e9cbef9cda8802a15f73dd2ae37c7e7ae4877953d0789355780fdf20bda"} Apr 23 13:43:38.608589 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:38.608553 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/model-serving-api-86f7b4b499-vl94g" event={"ID":"b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb","Type":"ContainerStarted","Data":"9f4c412124bc42e0e52ae22dd321e9f61ccca2298f20cbaf169495a073957c97"} Apr 23 13:43:38.608999 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:38.608651 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:43:38.624698 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:38.624535 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/model-serving-api-86f7b4b499-vl94g" podStartSLOduration=2.417015322 podStartE2EDuration="3.624521617s" podCreationTimestamp="2026-04-23 
13:43:35 +0000 UTC" firstStartedPulling="2026-04-23 13:43:37.321517437 +0000 UTC m=+759.593446749" lastFinishedPulling="2026-04-23 13:43:38.529023728 +0000 UTC m=+760.800953044" observedRunningTime="2026-04-23 13:43:38.623770057 +0000 UTC m=+760.895699390" watchObservedRunningTime="2026-04-23 13:43:38.624521617 +0000 UTC m=+760.896450951" Apr 23 13:43:49.617316 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:43:49.617284 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/model-serving-api-86f7b4b499-vl94g" Apr 23 13:44:21.022591 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.022524 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"] Apr 23 13:44:21.031269 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.031232 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.035247 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.035222 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:44:21.035387 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.035222 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 23 13:44:21.035387 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.035223 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-epp-sa-dockercfg-nfpdt\"" Apr 23 13:44:21.035387 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.035222 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:44:21.035387 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.035229 
2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\"" Apr 23 13:44:21.039275 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.039214 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"] Apr 23 13:44:21.065126 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.065096 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrpdc\" (UniqueName: \"kubernetes.io/projected/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kube-api-access-xrpdc\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.065126 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.065136 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.065349 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.065179 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.065349 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.065237 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.065349 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.065322 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.065448 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.065369 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166284 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166246 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166465 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166303 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166465 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166334 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166465 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166375 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166465 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166420 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166669 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166493 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-xrpdc\" (UniqueName: \"kubernetes.io/projected/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kube-api-access-xrpdc\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166759 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166736 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-tmp\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166826 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166772 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-cache\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166881 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166856 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-uds\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.166881 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.166864 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.169057 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.169038 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tls-certs\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.174365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.174345 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrpdc\" (UniqueName: \"kubernetes.io/projected/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kube-api-access-xrpdc\") pod \"scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.343438 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.343333 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" Apr 23 13:44:21.477683 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.477654 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"] Apr 23 13:44:21.479062 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:44:21.479038 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda1e97dd_c0f0_4ed4_a68e_6167775a6bdf.slice/crio-eb8e76a13ee91f62ac41e018a20f419dfe7fca8987fec8de282acac6dd665a68 WatchSource:0}: Error finding container eb8e76a13ee91f62ac41e018a20f419dfe7fca8987fec8de282acac6dd665a68: Status 404 returned error can't find the container with id eb8e76a13ee91f62ac41e018a20f419dfe7fca8987fec8de282acac6dd665a68 Apr 23 13:44:21.765379 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:21.765340 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" event={"ID":"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf","Type":"ContainerStarted","Data":"eb8e76a13ee91f62ac41e018a20f419dfe7fca8987fec8de282acac6dd665a68"} Apr 23 13:44:24.777387 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:24.777343 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" event={"ID":"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf","Type":"ContainerStarted","Data":"2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9"} Apr 23 13:44:25.782182 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:25.782122 2581 generic.go:358] "Generic (PLEG): container finished" podID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerID="2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9" exitCode=0 Apr 23 13:44:25.782555 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:44:25.782199 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" event={"ID":"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf","Type":"ContainerDied","Data":"2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9"}
Apr 23 13:44:26.942240 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:26.942218 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:44:27.793783 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:27.793742 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" event={"ID":"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf","Type":"ContainerStarted","Data":"d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964"}
Apr 23 13:44:57.925618 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:57.925582 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" event={"ID":"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf","Type":"ContainerStarted","Data":"7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071"}
Apr 23 13:44:57.926068 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:57.925936 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"
Apr 23 13:44:57.928691 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:57.928665 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"
Apr 23 13:44:57.949509 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:44:57.949383 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" podStartSLOduration=0.729363757 podStartE2EDuration="36.949363754s" podCreationTimestamp="2026-04-23 13:44:21 +0000 UTC" firstStartedPulling="2026-04-23 13:44:21.480934133 +0000 UTC m=+803.752863445" lastFinishedPulling="2026-04-23 13:44:57.700934127 +0000 UTC m=+839.972863442" observedRunningTime="2026-04-23 13:44:57.947497166 +0000 UTC m=+840.219426499" watchObservedRunningTime="2026-04-23 13:44:57.949363754 +0000 UTC m=+840.221293090"
Apr 23 13:45:01.343686 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:01.343641 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"
Apr 23 13:45:01.343686 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:01.343687 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"
Apr 23 13:45:11.345749 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:11.345719 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"
Apr 23 13:45:11.346877 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:11.346861 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"
Apr 23 13:45:31.643128 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:31.643096 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"]
Apr 23 13:45:31.643601 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:31.643455 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="main" containerID="cri-o://d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964" gracePeriod=30
Apr 23 13:45:31.643601 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:31.643518 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="tokenizer" containerID="cri-o://7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071" gracePeriod=30
Apr 23 13:45:32.057603 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:32.057505 2581 generic.go:358] "Generic (PLEG): container finished" podID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerID="d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964" exitCode=0
Apr 23 13:45:32.057603 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:32.057559 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" event={"ID":"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf","Type":"ContainerDied","Data":"d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964"}
Apr 23 13:45:32.992768 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:32.992741 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"
Apr 23 13:45:33.062560 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.062466 2581 generic.go:358] "Generic (PLEG): container finished" podID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerID="7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071" exitCode=0
Apr 23 13:45:33.062560 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.062551 2581 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"
Apr 23 13:45:33.062560 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.062552 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" event={"ID":"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf","Type":"ContainerDied","Data":"7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071"}
Apr 23 13:45:33.062794 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.062592 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh" event={"ID":"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf","Type":"ContainerDied","Data":"eb8e76a13ee91f62ac41e018a20f419dfe7fca8987fec8de282acac6dd665a68"}
Apr 23 13:45:33.062794 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.062612 2581 scope.go:117] "RemoveContainer" containerID="7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071"
Apr 23 13:45:33.071028 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.071007 2581 scope.go:117] "RemoveContainer" containerID="d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964"
Apr 23 13:45:33.078923 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.078893 2581 scope.go:117] "RemoveContainer" containerID="2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9"
Apr 23 13:45:33.086721 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.086698 2581 scope.go:117] "RemoveContainer" containerID="7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071"
Apr 23 13:45:33.086980 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:45:33.086961 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071\": container with ID starting with 7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071 not found: ID does not exist" containerID="7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071"
Apr 23 13:45:33.087033 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.086989 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071"} err="failed to get container status \"7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071\": rpc error: code = NotFound desc = could not find container \"7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071\": container with ID starting with 7eec7c00e590898c86e4065f3f98aab7a0455102c92417d384b12e87066b1071 not found: ID does not exist"
Apr 23 13:45:33.087033 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.087007 2581 scope.go:117] "RemoveContainer" containerID="d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964"
Apr 23 13:45:33.087308 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:45:33.087291 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964\": container with ID starting with d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964 not found: ID does not exist" containerID="d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964"
Apr 23 13:45:33.087356 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.087313 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964"} err="failed to get container status \"d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964\": rpc error: code = NotFound desc = could not find container \"d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964\": container with ID starting with d5a0bdb5387c460360586ef29a2ad9269ff7f822c64823c7bfbec4991e76f964 not found: ID does not exist"
Apr 23 13:45:33.087356 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.087326 2581 scope.go:117] "RemoveContainer" containerID="2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9"
Apr 23 13:45:33.087497 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:45:33.087483 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9\": container with ID starting with 2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9 not found: ID does not exist" containerID="2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9"
Apr 23 13:45:33.087536 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.087503 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9"} err="failed to get container status \"2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9\": rpc error: code = NotFound desc = could not find container \"2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9\": container with ID starting with 2f1dc8b8b9902ec8c1c8b77ffb860040f7a6fcb327ff9a5b552e9933e6f7e9f9 not found: ID does not exist"
Apr 23 13:45:33.134026 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.133990 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrpdc\" (UniqueName: \"kubernetes.io/projected/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kube-api-access-xrpdc\") pod \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") "
Apr 23 13:45:33.134245 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.134097 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-tmp\") pod \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") "
Apr 23 13:45:33.134245 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.134143 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kserve-provision-location\") pod \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") "
Apr 23 13:45:33.134245 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.134198 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-cache\") pod \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") "
Apr 23 13:45:33.134245 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.134236 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tls-certs\") pod \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") "
Apr 23 13:45:33.134475 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.134274 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-uds\") pod \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\" (UID: \"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf\") "
Apr 23 13:45:33.134544 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.134513 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" (UID:
"da1e97dd-c0f0-4ed4-a68e-6167775a6bdf"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:33.134601 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.134573 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" (UID: "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:33.134601 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.134494 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" (UID: "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:33.135236 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.135209 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" (UID: "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:33.136527 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.136509 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kube-api-access-xrpdc" (OuterVolumeSpecName: "kube-api-access-xrpdc") pod "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" (UID: "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf"). InnerVolumeSpecName "kube-api-access-xrpdc". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:45:33.136637 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.136613 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" (UID: "da1e97dd-c0f0-4ed4-a68e-6167775a6bdf"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:45:33.234970 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.234928 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:45:33.234970 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.234962 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xrpdc\" (UniqueName: \"kubernetes.io/projected/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kube-api-access-xrpdc\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:45:33.234970 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.234972 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:45:33.235225 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.234982 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:45:33.235225 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.234991 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:45:33.235225 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.235000 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:45:33.389954 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.389926 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"]
Apr 23 13:45:33.396545 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:33.396513 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-router-scheduler-665bbhtmlh"]
Apr 23 13:45:34.187955 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:34.187919 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" path="/var/lib/kubelet/pods/da1e97dd-c0f0-4ed4-a68e-6167775a6bdf/volumes"
Apr 23 13:45:40.995305 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995207 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"]
Apr 23 13:45:40.995805 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995784 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="tokenizer"
Apr 23 13:45:40.995881 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995809 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="tokenizer"
Apr 23 13:45:40.995881 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995829 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="main"
Apr 23 13:45:40.995881 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995838 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="main"
Apr 23 13:45:40.995881 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995871 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="storage-initializer"
Apr 23 13:45:40.995881 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995882 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="storage-initializer"
Apr 23 13:45:40.996133 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995970 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="main"
Apr 23 13:45:40.996133 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:40.995987 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="da1e97dd-c0f0-4ed4-a68e-6167775a6bdf" containerName="tokenizer"
Apr 23 13:45:41.001277 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.001255 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.004542 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.004516 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\""
Apr 23 13:45:41.004798 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.004780 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 23 13:45:41.004883 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.004851 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 23 13:45:41.005533 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.005511 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-74vn9\""
Apr 23 13:45:41.005632 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.005570 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\""
Apr 23 13:45:41.014134 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.014101 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"]
Apr 23 13:45:41.107557 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.107524 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.107739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.107577 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.107739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.107617 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.107739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.107666 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.107739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.107691 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vwlcv\" (UniqueName: \"kubernetes.io/projected/25f25dda-cd95-4e18-b415-907fd1770cc9-kube-api-access-vwlcv\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.107739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.107724 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25f25dda-cd95-4e18-b415-907fd1770cc9-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209202 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209116 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209202 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209213 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209429 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209250 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209429 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209276 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vwlcv\" (UniqueName: \"kubernetes.io/projected/25f25dda-cd95-4e18-b415-907fd1770cc9-kube-api-access-vwlcv\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209429 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209406 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25f25dda-cd95-4e18-b415-907fd1770cc9-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209568 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209452 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209568 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209538 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") "
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209675 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209604 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209675 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209643 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.209778 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.209739 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.212036 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.212012 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25f25dda-cd95-4e18-b415-907fd1770cc9-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.218215 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.218191 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vwlcv\" (UniqueName: \"kubernetes.io/projected/25f25dda-cd95-4e18-b415-907fd1770cc9-kube-api-access-vwlcv\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.313382 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.313288 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:41.444880 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:41.444853 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"]
Apr 23 13:45:41.447646 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:45:41.447612 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod25f25dda_cd95_4e18_b415_907fd1770cc9.slice/crio-387e25982e7dcb6bb52d7b0e20b9add9a5d66f07288750c3719bcabc6d4dc9c7 WatchSource:0}: Error finding container 387e25982e7dcb6bb52d7b0e20b9add9a5d66f07288750c3719bcabc6d4dc9c7: Status 404 returned error can't find the container with id 387e25982e7dcb6bb52d7b0e20b9add9a5d66f07288750c3719bcabc6d4dc9c7
Apr 23 13:45:42.094804 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:42.094766 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" event={"ID":"25f25dda-cd95-4e18-b415-907fd1770cc9","Type":"ContainerStarted","Data":"004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946"}
Apr 23 13:45:42.094804 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:42.094806 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" event={"ID":"25f25dda-cd95-4e18-b415-907fd1770cc9","Type":"ContainerStarted","Data":"387e25982e7dcb6bb52d7b0e20b9add9a5d66f07288750c3719bcabc6d4dc9c7"}
Apr 23 13:45:43.099367 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:43.099332 2581 generic.go:358] "Generic (PLEG): container finished" podID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerID="004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946" exitCode=0
Apr 23 13:45:43.099842 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:43.099420 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" event={"ID":"25f25dda-cd95-4e18-b415-907fd1770cc9","Type":"ContainerDied","Data":"004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946"}
Apr 23 13:45:44.104644 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:44.104600 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" event={"ID":"25f25dda-cd95-4e18-b415-907fd1770cc9","Type":"ContainerStarted","Data":"cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072"}
Apr 23 13:45:44.105040 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:44.104653 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" event={"ID":"25f25dda-cd95-4e18-b415-907fd1770cc9","Type":"ContainerStarted","Data":"6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13"}
Apr 23 13:45:44.105040 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:44.104699 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"
Apr 23 13:45:44.126554 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:44.126493 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" podStartSLOduration=4.12647616 podStartE2EDuration="4.12647616s" podCreationTimestamp="2026-04-23 13:45:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:45:44.124408228 +0000 UTC m=+886.396337562" watchObservedRunningTime="2026-04-23 13:45:44.12647616 +0000 UTC m=+886.398405493"
Apr 23 13:45:50.630096 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.630061 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"]
Apr 23 13:45:50.634741 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.634713 2581 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.638323 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.638297 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 23 13:45:50.638323 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.638318 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-wqvc9\""
Apr 23 13:45:50.683321 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.683280 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"]
Apr 23 13:45:50.795263 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.795219 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.795446 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.795283 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.795446 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.795344 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1ee408a-f236-4aac-8261-a698bead5f68-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.795446 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.795370 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xcnms\" (UniqueName: \"kubernetes.io/projected/b1ee408a-f236-4aac-8261-a698bead5f68-kube-api-access-xcnms\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.795446 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.795397 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.795446 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.795422 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.896495 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896409 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.896495 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896465 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.896712 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896509 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1ee408a-f236-4aac-8261-a698bead5f68-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.896712 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896533 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xcnms\" (UniqueName: \"kubernetes.io/projected/b1ee408a-f236-4aac-8261-a698bead5f68-kube-api-access-xcnms\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"
Apr 23 13:45:50.896712 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896559 2581 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:50.896712 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896585 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:50.896903 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896843 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:50.896959 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896897 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:50.896959 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.896945 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:50.897061 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.897027 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:50.899428 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.899408 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1ee408a-f236-4aac-8261-a698bead5f68-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:50.908336 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.908314 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xcnms\" (UniqueName: \"kubernetes.io/projected/b1ee408a-f236-4aac-8261-a698bead5f68-kube-api-access-xcnms\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:50.944237 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:50.944195 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:45:51.080612 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:51.080587 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"] Apr 23 13:45:51.082656 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:45:51.082626 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb1ee408a_f236_4aac_8261_a698bead5f68.slice/crio-63782d6f0c9138cf0d2846db483a107c6f0e248bf3389709dd7dfa675eadd69a WatchSource:0}: Error finding container 63782d6f0c9138cf0d2846db483a107c6f0e248bf3389709dd7dfa675eadd69a: Status 404 returned error can't find the container with id 63782d6f0c9138cf0d2846db483a107c6f0e248bf3389709dd7dfa675eadd69a Apr 23 13:45:51.143720 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:51.143690 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" event={"ID":"b1ee408a-f236-4aac-8261-a698bead5f68","Type":"ContainerStarted","Data":"63782d6f0c9138cf0d2846db483a107c6f0e248bf3389709dd7dfa675eadd69a"} Apr 23 13:45:51.314201 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:51.314143 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" Apr 23 13:45:51.314412 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:51.314213 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" Apr 23 13:45:51.317597 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:51.317567 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" Apr 23 13:45:52.149604 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:52.149565 2581 generic.go:358] "Generic (PLEG): container finished" podID="b1ee408a-f236-4aac-8261-a698bead5f68" containerID="0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9" exitCode=0 Apr 23 13:45:52.150100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:52.149663 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" event={"ID":"b1ee408a-f236-4aac-8261-a698bead5f68","Type":"ContainerDied","Data":"0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9"} Apr 23 13:45:52.151500 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:52.151377 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" Apr 23 13:45:53.155439 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:53.155393 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" event={"ID":"b1ee408a-f236-4aac-8261-a698bead5f68","Type":"ContainerStarted","Data":"e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8"} Apr 23 13:45:53.155439 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:53.155438 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" event={"ID":"b1ee408a-f236-4aac-8261-a698bead5f68","Type":"ContainerStarted","Data":"4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b"} Apr 23 13:45:53.156002 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:53.155718 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 
13:45:53.182497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:53.182426 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" podStartSLOduration=3.182404924 podStartE2EDuration="3.182404924s" podCreationTimestamp="2026-04-23 13:45:50 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:45:53.17985895 +0000 UTC m=+895.451788312" watchObservedRunningTime="2026-04-23 13:45:53.182404924 +0000 UTC m=+895.454334261" Apr 23 13:45:58.206867 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:58.206836 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 13:45:58.209454 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:45:58.209432 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 13:46:00.945339 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:00.945296 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:46:00.945339 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:00.945347 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:46:00.948081 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:00.948058 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:46:01.187789 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:01.187758 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" 
status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:46:13.157213 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:13.157183 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" Apr 23 13:46:22.192071 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:22.192045 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:46:35.408603 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:35.408568 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"] Apr 23 13:46:35.410853 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:35.408978 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="main" containerID="cri-o://6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13" gracePeriod=30 Apr 23 13:46:35.410853 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:35.409022 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="tokenizer" containerID="cri-o://cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072" gracePeriod=30 Apr 23 13:46:36.312378 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.312337 2581 generic.go:358] "Generic (PLEG): container finished" podID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerID="6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13" exitCode=0 Apr 23 13:46:36.312545 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:46:36.312412 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" event={"ID":"25f25dda-cd95-4e18-b415-907fd1770cc9","Type":"ContainerDied","Data":"6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13"} Apr 23 13:46:36.760516 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.760490 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" Apr 23 13:46:36.823884 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.823807 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25f25dda-cd95-4e18-b415-907fd1770cc9-tls-certs\") pod \"25f25dda-cd95-4e18-b415-907fd1770cc9\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " Apr 23 13:46:36.823884 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.823854 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-uds\") pod \"25f25dda-cd95-4e18-b415-907fd1770cc9\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " Apr 23 13:46:36.823884 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.823874 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-kserve-provision-location\") pod \"25f25dda-cd95-4e18-b415-907fd1770cc9\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " Apr 23 13:46:36.824172 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.823909 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-cache\") pod 
\"25f25dda-cd95-4e18-b415-907fd1770cc9\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " Apr 23 13:46:36.824172 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.823934 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-tmp\") pod \"25f25dda-cd95-4e18-b415-907fd1770cc9\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " Apr 23 13:46:36.824172 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.823984 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vwlcv\" (UniqueName: \"kubernetes.io/projected/25f25dda-cd95-4e18-b415-907fd1770cc9-kube-api-access-vwlcv\") pod \"25f25dda-cd95-4e18-b415-907fd1770cc9\" (UID: \"25f25dda-cd95-4e18-b415-907fd1770cc9\") " Apr 23 13:46:36.824332 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.824144 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "25f25dda-cd95-4e18-b415-907fd1770cc9" (UID: "25f25dda-cd95-4e18-b415-907fd1770cc9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:36.824332 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.824254 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "25f25dda-cd95-4e18-b415-907fd1770cc9" (UID: "25f25dda-cd95-4e18-b415-907fd1770cc9"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:36.824427 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.824339 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:46:36.824427 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.824360 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:46:36.824824 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.824798 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "25f25dda-cd95-4e18-b415-907fd1770cc9" (UID: "25f25dda-cd95-4e18-b415-907fd1770cc9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:36.825190 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.825144 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "25f25dda-cd95-4e18-b415-907fd1770cc9" (UID: "25f25dda-cd95-4e18-b415-907fd1770cc9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:46:36.826262 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.826128 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/25f25dda-cd95-4e18-b415-907fd1770cc9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "25f25dda-cd95-4e18-b415-907fd1770cc9" (UID: "25f25dda-cd95-4e18-b415-907fd1770cc9"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:46:36.826346 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.826324 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/25f25dda-cd95-4e18-b415-907fd1770cc9-kube-api-access-vwlcv" (OuterVolumeSpecName: "kube-api-access-vwlcv") pod "25f25dda-cd95-4e18-b415-907fd1770cc9" (UID: "25f25dda-cd95-4e18-b415-907fd1770cc9"). InnerVolumeSpecName "kube-api-access-vwlcv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:46:36.924975 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.924939 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vwlcv\" (UniqueName: \"kubernetes.io/projected/25f25dda-cd95-4e18-b415-907fd1770cc9-kube-api-access-vwlcv\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:46:36.924975 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.924968 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/25f25dda-cd95-4e18-b415-907fd1770cc9-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:46:36.924975 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.924980 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:46:36.924975 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:36.924988 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/25f25dda-cd95-4e18-b415-907fd1770cc9-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:46:37.317939 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.317900 2581 generic.go:358] "Generic (PLEG): container finished" 
podID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerID="cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072" exitCode=0 Apr 23 13:46:37.318135 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.317946 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" event={"ID":"25f25dda-cd95-4e18-b415-907fd1770cc9","Type":"ContainerDied","Data":"cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072"} Apr 23 13:46:37.318135 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.317975 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" event={"ID":"25f25dda-cd95-4e18-b415-907fd1770cc9","Type":"ContainerDied","Data":"387e25982e7dcb6bb52d7b0e20b9add9a5d66f07288750c3719bcabc6d4dc9c7"} Apr 23 13:46:37.318135 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.317978 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq" Apr 23 13:46:37.318135 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.317994 2581 scope.go:117] "RemoveContainer" containerID="cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072" Apr 23 13:46:37.328081 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.328064 2581 scope.go:117] "RemoveContainer" containerID="6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13" Apr 23 13:46:37.335836 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.335818 2581 scope.go:117] "RemoveContainer" containerID="004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946" Apr 23 13:46:37.341872 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.341847 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"] Apr 23 13:46:37.344968 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.344954 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-6bff8pqnwq"] Apr 23 13:46:37.345029 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.345011 2581 scope.go:117] "RemoveContainer" containerID="cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072" Apr 23 13:46:37.345380 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:46:37.345359 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072\": container with ID starting with cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072 not found: ID does not exist" containerID="cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072" Apr 23 13:46:37.345468 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.345387 2581 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072"} err="failed to get container status \"cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072\": rpc error: code = NotFound desc = could not find container \"cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072\": container with ID starting with cbf2f20a745df11e9fd0cdee7f38a4098d5220a1e2f27bfd4f63c73266656072 not found: ID does not exist" Apr 23 13:46:37.345468 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.345405 2581 scope.go:117] "RemoveContainer" containerID="6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13" Apr 23 13:46:37.345693 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:46:37.345672 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13\": container with ID starting with 6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13 not found: ID does not exist" containerID="6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13" Apr 23 13:46:37.345780 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.345703 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13"} err="failed to get container status \"6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13\": rpc error: code = NotFound desc = could not find container \"6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13\": container with ID starting with 6b5612ee46c33709d11dd41b91b6e5ad323c2a18de10be53a31aaaa85528dc13 not found: ID does not exist" Apr 23 13:46:37.345780 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.345727 2581 scope.go:117] "RemoveContainer" containerID="004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946" Apr 23 13:46:37.345988 ip-10-0-133-33 
kubenswrapper[2581]: E0423 13:46:37.345972 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946\": container with ID starting with 004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946 not found: ID does not exist" containerID="004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946" Apr 23 13:46:37.346042 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:37.345996 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946"} err="failed to get container status \"004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946\": rpc error: code = NotFound desc = could not find container \"004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946\": container with ID starting with 004804d48c9f5fd01349ccc0edf48371f20519fa19c639e22c9bea6500a87946 not found: ID does not exist" Apr 23 13:46:38.188637 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:38.188605 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" path="/var/lib/kubelet/pods/25f25dda-cd95-4e18-b415-907fd1770cc9/volumes" Apr 23 13:46:39.047257 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047219 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"] Apr 23 13:46:39.047678 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047662 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="tokenizer" Apr 23 13:46:39.047731 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047682 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="tokenizer" Apr 23 
13:46:39.047731 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047704 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="storage-initializer" Apr 23 13:46:39.047731 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047710 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="storage-initializer" Apr 23 13:46:39.047731 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047717 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="main" Apr 23 13:46:39.047731 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047723 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="main" Apr 23 13:46:39.047880 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047781 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="tokenizer" Apr 23 13:46:39.047880 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.047789 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="25f25dda-cd95-4e18-b415-907fd1770cc9" containerName="main" Apr 23 13:46:39.052592 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.052569 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.055886 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.055861 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 23 13:46:39.055996 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.055896 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-jw9ff\""
Apr 23 13:46:39.062697 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.062671 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"]
Apr 23 13:46:39.143330 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.143293 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.143330 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.143334 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.143583 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.143372 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.143583 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.143445 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p89p5\" (UniqueName: \"kubernetes.io/projected/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kube-api-access-p89p5\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.143583 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.143498 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.143583 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.143525 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.243965 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.243932 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.243972 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.244010 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.244032 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.244066 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.244094 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p89p5\" (UniqueName: \"kubernetes.io/projected/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kube-api-access-p89p5\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.244433 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244666 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.244457 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244666 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.244504 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.244666 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.244545 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.246740 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.246720 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.251924 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.251904 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p89p5\" (UniqueName: \"kubernetes.io/projected/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kube-api-access-p89p5\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.363298 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.363263 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:39.495060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:39.495012 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"]
Apr 23 13:46:39.498049 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:46:39.498019 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5bcc5085_cd00_4122_9d72_c8aa4d76dc5c.slice/crio-c3a726c9fa1ea6c5d2b192aec4775c975262b265920e83c2624a7b8871cf1223 WatchSource:0}: Error finding container c3a726c9fa1ea6c5d2b192aec4775c975262b265920e83c2624a7b8871cf1223: Status 404 returned error can't find the container with id c3a726c9fa1ea6c5d2b192aec4775c975262b265920e83c2624a7b8871cf1223
Apr 23 13:46:40.330686 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:40.330650 2581 generic.go:358] "Generic (PLEG): container finished" podID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerID="c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85" exitCode=0
Apr 23 13:46:40.331056 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:40.330696 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" event={"ID":"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c","Type":"ContainerDied","Data":"c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85"}
Apr 23 13:46:40.331056 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:40.330718 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" event={"ID":"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c","Type":"ContainerStarted","Data":"c3a726c9fa1ea6c5d2b192aec4775c975262b265920e83c2624a7b8871cf1223"}
Apr 23 13:46:41.335842 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:41.335801 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" event={"ID":"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c","Type":"ContainerStarted","Data":"4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7"}
Apr 23 13:46:41.336257 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:41.335848 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" event={"ID":"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c","Type":"ContainerStarted","Data":"dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e"}
Apr 23 13:46:41.336257 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:41.335927 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:41.354535 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:41.354459 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" podStartSLOduration=2.354442889 podStartE2EDuration="2.354442889s" podCreationTimestamp="2026-04-23 13:46:39 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:46:41.353121052 +0000 UTC m=+943.625050386" watchObservedRunningTime="2026-04-23 13:46:41.354442889 +0000 UTC m=+943.626372236"
Apr 23 13:46:49.363451 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:49.363417 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:49.363451 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:49.363458 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:49.366606 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:49.366581 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:46:50.371069 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:46:50.371038 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:47:11.374696 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:11.374620 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:47:13.059018 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:13.058975 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"]
Apr 23 13:47:13.059973 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:13.059923 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="main" containerID="cri-o://dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e" gracePeriod=30
Apr 23 13:47:13.059973 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:13.059953 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="tokenizer" containerID="cri-o://4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7" gracePeriod=30
Apr 23 13:47:13.449596 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:13.449558 2581 generic.go:358] "Generic (PLEG): container finished" podID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerID="dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e" exitCode=0
Apr 23 13:47:13.449780 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:13.449627 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" event={"ID":"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c","Type":"ContainerDied","Data":"dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e"}
Apr 23 13:47:14.409810 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.409778 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:47:14.454547 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.454460 2581 generic.go:358] "Generic (PLEG): container finished" podID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerID="4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7" exitCode=0
Apr 23 13:47:14.454547 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.454511 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" event={"ID":"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c","Type":"ContainerDied","Data":"4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7"}
Apr 23 13:47:14.454547 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.454539 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x" event={"ID":"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c","Type":"ContainerDied","Data":"c3a726c9fa1ea6c5d2b192aec4775c975262b265920e83c2624a7b8871cf1223"}
Apr 23 13:47:14.454791 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.454554 2581 scope.go:117] "RemoveContainer" containerID="4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7"
Apr 23 13:47:14.454791 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.454556 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"
Apr 23 13:47:14.462818 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.462797 2581 scope.go:117] "RemoveContainer" containerID="dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e"
Apr 23 13:47:14.472246 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.472223 2581 scope.go:117] "RemoveContainer" containerID="c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85"
Apr 23 13:47:14.480243 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.480224 2581 scope.go:117] "RemoveContainer" containerID="4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7"
Apr 23 13:47:14.480497 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:47:14.480479 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7\": container with ID starting with 4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7 not found: ID does not exist" containerID="4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7"
Apr 23 13:47:14.480565 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.480507 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7"} err="failed to get container status \"4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7\": rpc error: code = NotFound desc = could not find container \"4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7\": container with ID starting with 4f60d79441961b9a00ba8925794844348f0ae84c405438c99688329038afcaa7 not found: ID does not exist"
Apr 23 13:47:14.480565 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.480525 2581 scope.go:117] "RemoveContainer" containerID="dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e"
Apr 23 13:47:14.480750 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:47:14.480730 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e\": container with ID starting with dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e not found: ID does not exist" containerID="dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e"
Apr 23 13:47:14.480809 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.480758 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e"} err="failed to get container status \"dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e\": rpc error: code = NotFound desc = could not find container \"dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e\": container with ID starting with dc012069b8e166ee179f493cd6c660f01e1cc137b1a3e6a5b032adc087436f9e not found: ID does not exist"
Apr 23 13:47:14.480809 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.480780 2581 scope.go:117] "RemoveContainer" containerID="c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85"
Apr 23 13:47:14.481022 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:47:14.481006 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85\": container with ID starting with c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85 not found: ID does not exist" containerID="c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85"
Apr 23 13:47:14.481062 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.481026 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85"} err="failed to get container status \"c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85\": rpc error: code = NotFound desc = could not find container \"c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85\": container with ID starting with c1a850488c09394dcec8fbed15893ac3e062c7b46078a9453ff63787d6bf4d85 not found: ID does not exist"
Apr 23 13:47:14.570068 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570027 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kserve-provision-location\") pod \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") "
Apr 23 13:47:14.570068 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570075 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-p89p5\" (UniqueName: \"kubernetes.io/projected/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kube-api-access-p89p5\") pod \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") "
Apr 23 13:47:14.570356 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570103 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tls-certs\") pod \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") "
Apr 23 13:47:14.570356 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570187 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-cache\") pod \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") "
Apr 23 13:47:14.570356 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570249 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-uds\") pod \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") "
Apr 23 13:47:14.570356 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570289 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-tmp\") pod \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\" (UID: \"5bcc5085-cd00-4122-9d72-c8aa4d76dc5c\") "
Apr 23 13:47:14.570583 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570527 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" (UID: "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:47:14.570756 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570673 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" (UID: "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:47:14.570876 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570798 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" (UID: "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:47:14.571010 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.570985 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" (UID: "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:47:14.572442 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.572413 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" (UID: "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:47:14.572526 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.572481 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kube-api-access-p89p5" (OuterVolumeSpecName: "kube-api-access-p89p5") pod "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" (UID: "5bcc5085-cd00-4122-9d72-c8aa4d76dc5c"). InnerVolumeSpecName "kube-api-access-p89p5". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:47:14.671352 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.671304 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:47:14.671352 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.671346 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-p89p5\" (UniqueName: \"kubernetes.io/projected/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-kube-api-access-p89p5\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:47:14.671352 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.671357 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:47:14.671585 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.671368 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:47:14.671585 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.671377 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:47:14.671585 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.671385 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:47:14.785976 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.785943 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"]
Apr 23 13:47:14.790047 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:14.790020 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c78689m42x"]
Apr 23 13:47:15.195139 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195103 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"]
Apr 23 13:47:15.195536 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195523 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="main"
Apr 23 13:47:15.195588 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195539 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="main"
Apr 23 13:47:15.195588 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195562 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="storage-initializer"
Apr 23 13:47:15.195588 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195568 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="storage-initializer"
Apr 23 13:47:15.195588 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195575 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="tokenizer"
Apr 23 13:47:15.195588 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195581 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="tokenizer"
Apr 23 13:47:15.195751 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195638 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="main"
Apr 23 13:47:15.195751 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.195648 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" containerName="tokenizer"
Apr 23 13:47:15.200542 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.200517 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.202969 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.202944 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\""
Apr 23 13:47:15.206856 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.206832 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"]
Apr 23 13:47:15.377361 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.377328 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.377361 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.377366 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cx8qf\" (UniqueName: \"kubernetes.io/projected/c2843431-4c69-4a14-8870-2dde178a95c0-kube-api-access-cx8qf\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.377630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.377393 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-home\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.377630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.377506 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-dshm\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.377630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.377580 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2843431-4c69-4a14-8870-2dde178a95c0-tls-certs\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.377630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.377612 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.377793 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.377651 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-model-cache\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.408872 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.408834 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"]
Apr 23 13:47:15.412536 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.412520 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"
Apr 23 13:47:15.414932 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.414913 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-epp-sa-dockercfg-bwrm7\""
Apr 23 13:47:15.422173 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.422124 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"]
Apr 23 13:47:15.478608 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478508 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.478608 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478550 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cx8qf\" (UniqueName: \"kubernetes.io/projected/c2843431-4c69-4a14-8870-2dde178a95c0-kube-api-access-cx8qf\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.478608 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478584 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-home\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.478936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478636 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-dshm\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.478936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478685 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2843431-4c69-4a14-8870-2dde178a95c0-tls-certs\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:47:15.478936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478710 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") "
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.478936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478747 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-model-cache\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.478936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478928 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-tmp-dir\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.479233 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.478979 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-home\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.479233 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.479086 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.479233 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.479178 2581 operation_generator.go:615] "MountVolume.SetUp 
succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-model-cache\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.481101 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.481079 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-dshm\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.481379 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.481360 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2843431-4c69-4a14-8870-2dde178a95c0-tls-certs\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.487639 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.487614 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cx8qf\" (UniqueName: \"kubernetes.io/projected/c2843431-4c69-4a14-8870-2dde178a95c0-kube-api-access-cx8qf\") pod \"precise-prefix-cache-test-kserve-65c6b49884-chwrv\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.513233 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.513193 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:15.579523 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.579489 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.579669 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.579533 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pzp44\" (UniqueName: \"kubernetes.io/projected/fb785597-d885-4128-ba2a-0271067ae426-kube-api-access-pzp44\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.579751 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.579670 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb785597-d885-4128-ba2a-0271067ae426-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.579751 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.579708 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: 
\"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.579751 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.579747 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.579877 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.579854 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.643623 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.643559 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"] Apr 23 13:47:15.646506 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:47:15.646474 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc2843431_4c69_4a14_8870_2dde178a95c0.slice/crio-bd4b9d27a75422e0fed08cf2f4277549e4038fc65a28626c00137b57f14ae362 WatchSource:0}: Error finding container bd4b9d27a75422e0fed08cf2f4277549e4038fc65a28626c00137b57f14ae362: Status 404 returned error can't find the container with id bd4b9d27a75422e0fed08cf2f4277549e4038fc65a28626c00137b57f14ae362 Apr 23 13:47:15.681148 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:47:15.681122 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb785597-d885-4128-ba2a-0271067ae426-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.681287 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681177 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.681287 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681204 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.681287 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681238 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.681287 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681261 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.681287 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681282 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pzp44\" (UniqueName: \"kubernetes.io/projected/fb785597-d885-4128-ba2a-0271067ae426-kube-api-access-pzp44\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.681671 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681650 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.681777 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681718 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-cache\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.681777 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681746 2581 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-uds\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.682068 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.681809 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-tmp\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.683760 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.683738 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb785597-d885-4128-ba2a-0271067ae426-tls-certs\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.690074 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.690052 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pzp44\" (UniqueName: \"kubernetes.io/projected/fb785597-d885-4128-ba2a-0271067ae426-kube-api-access-pzp44\") pod \"precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.724313 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.724283 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:15.860445 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:15.860414 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"] Apr 23 13:47:15.862984 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:47:15.862945 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfb785597_d885_4128_ba2a_0271067ae426.slice/crio-aad73dfcf8a9c3fbd8d5d30b5ca508ee0c6538463decb1dcf6630275105a2607 WatchSource:0}: Error finding container aad73dfcf8a9c3fbd8d5d30b5ca508ee0c6538463decb1dcf6630275105a2607: Status 404 returned error can't find the container with id aad73dfcf8a9c3fbd8d5d30b5ca508ee0c6538463decb1dcf6630275105a2607 Apr 23 13:47:16.188309 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:16.188273 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5bcc5085-cd00-4122-9d72-c8aa4d76dc5c" path="/var/lib/kubelet/pods/5bcc5085-cd00-4122-9d72-c8aa4d76dc5c/volumes" Apr 23 13:47:16.465394 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:16.465300 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" event={"ID":"c2843431-4c69-4a14-8870-2dde178a95c0","Type":"ContainerStarted","Data":"0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321"} Apr 23 13:47:16.465394 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:16.465340 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" event={"ID":"c2843431-4c69-4a14-8870-2dde178a95c0","Type":"ContainerStarted","Data":"bd4b9d27a75422e0fed08cf2f4277549e4038fc65a28626c00137b57f14ae362"} Apr 23 13:47:16.466609 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:16.466585 2581 kubelet.go:2569] 
"SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerStarted","Data":"fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a"} Apr 23 13:47:16.466680 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:16.466613 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerStarted","Data":"aad73dfcf8a9c3fbd8d5d30b5ca508ee0c6538463decb1dcf6630275105a2607"} Apr 23 13:47:17.472521 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:17.472485 2581 generic.go:358] "Generic (PLEG): container finished" podID="fb785597-d885-4128-ba2a-0271067ae426" containerID="fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a" exitCode=0 Apr 23 13:47:17.472952 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:17.472581 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerDied","Data":"fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a"} Apr 23 13:47:18.478797 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:18.478761 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerStarted","Data":"e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792"} Apr 23 13:47:18.478797 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:18.478799 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" 
event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerStarted","Data":"9e04657e4dc5d0244645d6727de947b709f59364a2fd52926adcb6c369410134"} Apr 23 13:47:18.479262 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:18.478929 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:18.508395 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:18.508340 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" podStartSLOduration=3.508323797 podStartE2EDuration="3.508323797s" podCreationTimestamp="2026-04-23 13:47:15 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:47:18.506625766 +0000 UTC m=+980.778555133" watchObservedRunningTime="2026-04-23 13:47:18.508323797 +0000 UTC m=+980.780253130" Apr 23 13:47:20.488403 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:20.488366 2581 generic.go:358] "Generic (PLEG): container finished" podID="c2843431-4c69-4a14-8870-2dde178a95c0" containerID="0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321" exitCode=0 Apr 23 13:47:20.488896 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:20.488442 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" event={"ID":"c2843431-4c69-4a14-8870-2dde178a95c0","Type":"ContainerDied","Data":"0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321"} Apr 23 13:47:22.499186 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:22.499133 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" 
event={"ID":"c2843431-4c69-4a14-8870-2dde178a95c0","Type":"ContainerStarted","Data":"deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e"} Apr 23 13:47:22.521462 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:22.521411 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" podStartSLOduration=6.383422021 podStartE2EDuration="7.52139754s" podCreationTimestamp="2026-04-23 13:47:15 +0000 UTC" firstStartedPulling="2026-04-23 13:47:20.489753707 +0000 UTC m=+982.761683019" lastFinishedPulling="2026-04-23 13:47:21.627729223 +0000 UTC m=+983.899658538" observedRunningTime="2026-04-23 13:47:22.51968032 +0000 UTC m=+984.791609660" watchObservedRunningTime="2026-04-23 13:47:22.52139754 +0000 UTC m=+984.793326874" Apr 23 13:47:25.514439 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:25.514404 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:25.514439 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:25.514447 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:25.527100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:25.527064 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:25.725228 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:25.725191 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:25.725228 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:25.725237 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:25.727772 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:25.727744 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:25.727903 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:47:25.727883 2581 logging.go:55] [core] [Channel #125 SubChannel #126]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.38:9003", ServerName: "10.132.0.38:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.38:9003: connect: connection refused" Apr 23 13:47:26.515722 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:26.515694 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:26.530376 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:26.530354 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" Apr 23 13:47:26.725904 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:26.725829 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.38:9003\" within 1s: context deadline exceeded" Apr 23 13:47:28.523731 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:28.523702 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r_fb785597-d885-4128-ba2a-0271067ae426/main/0.log" Apr 23 13:47:28.524182 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:28.524010 2581 generic.go:358] "Generic (PLEG): 
container finished" podID="fb785597-d885-4128-ba2a-0271067ae426" containerID="9e04657e4dc5d0244645d6727de947b709f59364a2fd52926adcb6c369410134" exitCode=1 Apr 23 13:47:28.524182 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:28.524079 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerDied","Data":"9e04657e4dc5d0244645d6727de947b709f59364a2fd52926adcb6c369410134"} Apr 23 13:47:28.524545 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:28.524526 2581 scope.go:117] "RemoveContainer" containerID="9e04657e4dc5d0244645d6727de947b709f59364a2fd52926adcb6c369410134" Apr 23 13:47:29.536186 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:29.536137 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r_fb785597-d885-4128-ba2a-0271067ae426/main/0.log" Apr 23 13:47:29.536605 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:29.536583 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerStarted","Data":"408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141"} Apr 23 13:47:29.536933 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:29.536912 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" Apr 23 13:47:35.725027 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:47:35.724996 2581 logging.go:55] [core] [Channel #133 SubChannel #134]grpc: addrConn.createTransport failed to connect to {Addr: "10.132.0.38:9003", ServerName: "10.132.0.38:9003", }. 
Err: connection error: desc = "transport: Error while dialing: dial tcp 10.132.0.38:9003: connect: connection refused"
Apr 23 13:47:36.724877 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:47:36.724828 2581 prober.go:120] "Probe failed" probeType="Liveness" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.132.0.38:9003\" within 1s: context deadline exceeded"
Apr 23 13:48:00.542119 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:00.542087 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"
Apr 23 13:48:01.576715 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.576681 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"]
Apr 23 13:48:01.577186 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.576987 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="tokenizer" containerID="cri-o://e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792" gracePeriod=30
Apr 23 13:48:01.577186 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.577058 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main" containerID="cri-o://408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141" gracePeriod=30
Apr 23 13:48:01.586741 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.586703 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"]
Apr 23 13:48:01.587377 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.587041 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" podUID="c2843431-4c69-4a14-8870-2dde178a95c0" containerName="main" containerID="cri-o://deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e" gracePeriod=30
Apr 23 13:48:01.839628 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.839559 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:48:01.929765 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.929729 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-kserve-provision-location\") pod \"c2843431-4c69-4a14-8870-2dde178a95c0\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") "
Apr 23 13:48:01.929962 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.929783 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-home\") pod \"c2843431-4c69-4a14-8870-2dde178a95c0\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") "
Apr 23 13:48:01.929962 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.929803 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-dshm\") pod \"c2843431-4c69-4a14-8870-2dde178a95c0\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") "
Apr 23 13:48:01.929962 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.929834 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-model-cache\") pod \"c2843431-4c69-4a14-8870-2dde178a95c0\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") "
Apr 23 13:48:01.929962 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.929869 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cx8qf\" (UniqueName: \"kubernetes.io/projected/c2843431-4c69-4a14-8870-2dde178a95c0-kube-api-access-cx8qf\") pod \"c2843431-4c69-4a14-8870-2dde178a95c0\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") "
Apr 23 13:48:01.929962 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.929907 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-tmp-dir\") pod \"c2843431-4c69-4a14-8870-2dde178a95c0\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") "
Apr 23 13:48:01.929962 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.929934 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2843431-4c69-4a14-8870-2dde178a95c0-tls-certs\") pod \"c2843431-4c69-4a14-8870-2dde178a95c0\" (UID: \"c2843431-4c69-4a14-8870-2dde178a95c0\") "
Apr 23 13:48:01.930364 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.930088 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-home" (OuterVolumeSpecName: "home") pod "c2843431-4c69-4a14-8870-2dde178a95c0" (UID: "c2843431-4c69-4a14-8870-2dde178a95c0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:01.930364 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.930129 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-model-cache" (OuterVolumeSpecName: "model-cache") pod "c2843431-4c69-4a14-8870-2dde178a95c0" (UID: "c2843431-4c69-4a14-8870-2dde178a95c0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:01.930364 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.930302 2581 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-home\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:01.930364 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.930324 2581 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-model-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:01.930364 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.930330 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "c2843431-4c69-4a14-8870-2dde178a95c0" (UID: "c2843431-4c69-4a14-8870-2dde178a95c0"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:01.932294 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.932264 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c2843431-4c69-4a14-8870-2dde178a95c0-kube-api-access-cx8qf" (OuterVolumeSpecName: "kube-api-access-cx8qf") pod "c2843431-4c69-4a14-8870-2dde178a95c0" (UID: "c2843431-4c69-4a14-8870-2dde178a95c0"). InnerVolumeSpecName "kube-api-access-cx8qf". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:48:01.932435 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.932412 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-dshm" (OuterVolumeSpecName: "dshm") pod "c2843431-4c69-4a14-8870-2dde178a95c0" (UID: "c2843431-4c69-4a14-8870-2dde178a95c0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:01.932521 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.932505 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c2843431-4c69-4a14-8870-2dde178a95c0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c2843431-4c69-4a14-8870-2dde178a95c0" (UID: "c2843431-4c69-4a14-8870-2dde178a95c0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:48:01.989190 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:01.989093 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c2843431-4c69-4a14-8870-2dde178a95c0" (UID: "c2843431-4c69-4a14-8870-2dde178a95c0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:02.031361 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.031314 2581 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-dshm\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:02.031361 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.031349 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cx8qf\" (UniqueName: \"kubernetes.io/projected/c2843431-4c69-4a14-8870-2dde178a95c0-kube-api-access-cx8qf\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:02.031361 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.031360 2581 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-tmp-dir\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:02.031361 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.031370 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c2843431-4c69-4a14-8870-2dde178a95c0-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:02.031361 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.031378 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c2843431-4c69-4a14-8870-2dde178a95c0-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:02.660297 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.660265 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r_fb785597-d885-4128-ba2a-0271067ae426/main/0.log"
Apr 23 13:48:02.660739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.660597 2581 generic.go:358] "Generic (PLEG): container finished" podID="fb785597-d885-4128-ba2a-0271067ae426" containerID="408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141" exitCode=0
Apr 23 13:48:02.660739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.660674 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerDied","Data":"408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141"}
Apr 23 13:48:02.660739 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.660719 2581 scope.go:117] "RemoveContainer" containerID="9e04657e4dc5d0244645d6727de947b709f59364a2fd52926adcb6c369410134"
Apr 23 13:48:02.662231 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.662211 2581 generic.go:358] "Generic (PLEG): container finished" podID="c2843431-4c69-4a14-8870-2dde178a95c0" containerID="deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e" exitCode=0
Apr 23 13:48:02.662334 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.662272 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" event={"ID":"c2843431-4c69-4a14-8870-2dde178a95c0","Type":"ContainerDied","Data":"deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e"}
Apr 23 13:48:02.662334 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.662293 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv" event={"ID":"c2843431-4c69-4a14-8870-2dde178a95c0","Type":"ContainerDied","Data":"bd4b9d27a75422e0fed08cf2f4277549e4038fc65a28626c00137b57f14ae362"}
Apr 23 13:48:02.662449 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.662331 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"
Apr 23 13:48:02.674379 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.674357 2581 scope.go:117] "RemoveContainer" containerID="deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e"
Apr 23 13:48:02.682173 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.682129 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"]
Apr 23 13:48:02.683120 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.683098 2581 scope.go:117] "RemoveContainer" containerID="0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321"
Apr 23 13:48:02.685362 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.685340 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-65c6b49884-chwrv"]
Apr 23 13:48:02.751334 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.751313 2581 scope.go:117] "RemoveContainer" containerID="deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e"
Apr 23 13:48:02.751694 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:48:02.751671 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e\": container with ID starting with deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e not found: ID does not exist" containerID="deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e"
Apr 23 13:48:02.751776 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.751708 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e"} err="failed to get container status \"deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e\": rpc error: code = NotFound desc = could not find container \"deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e\": container with ID starting with deb125fa32741ff5f365c2d8f5c1ee6154f8349a3bc4b761712ae43f226f717e not found: ID does not exist"
Apr 23 13:48:02.751776 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.751736 2581 scope.go:117] "RemoveContainer" containerID="0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321"
Apr 23 13:48:02.751973 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:48:02.751949 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321\": container with ID starting with 0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321 not found: ID does not exist" containerID="0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321"
Apr 23 13:48:02.752043 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:02.751982 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321"} err="failed to get container status \"0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321\": rpc error: code = NotFound desc = could not find container \"0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321\": container with ID starting with 0411a4a86510af42b2b5c137f386d0220946a84b794ef315cd8d1ae60c6a3321 not found: ID does not exist"
Apr 23 13:48:03.233524 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.233501 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"
Apr 23 13:48:03.344508 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.344423 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-kserve-provision-location\") pod \"fb785597-d885-4128-ba2a-0271067ae426\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") "
Apr 23 13:48:03.344508 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.344484 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-uds\") pod \"fb785597-d885-4128-ba2a-0271067ae426\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") "
Apr 23 13:48:03.344752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.344523 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-pzp44\" (UniqueName: \"kubernetes.io/projected/fb785597-d885-4128-ba2a-0271067ae426-kube-api-access-pzp44\") pod \"fb785597-d885-4128-ba2a-0271067ae426\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") "
Apr 23 13:48:03.344752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.344592 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-tmp\") pod \"fb785597-d885-4128-ba2a-0271067ae426\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") "
Apr 23 13:48:03.344752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.344631 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-cache\") pod \"fb785597-d885-4128-ba2a-0271067ae426\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") "
Apr 23 13:48:03.344898 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.344782 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "fb785597-d885-4128-ba2a-0271067ae426" (UID: "fb785597-d885-4128-ba2a-0271067ae426"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:03.344898 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.344795 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb785597-d885-4128-ba2a-0271067ae426-tls-certs\") pod \"fb785597-d885-4128-ba2a-0271067ae426\" (UID: \"fb785597-d885-4128-ba2a-0271067ae426\") "
Apr 23 13:48:03.345004 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.344972 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "fb785597-d885-4128-ba2a-0271067ae426" (UID: "fb785597-d885-4128-ba2a-0271067ae426"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:03.345065 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.345034 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "fb785597-d885-4128-ba2a-0271067ae426" (UID: "fb785597-d885-4128-ba2a-0271067ae426"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:03.345205 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.345139 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:03.345205 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.345196 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:03.345205 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.345209 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:03.345610 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.345288 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "fb785597-d885-4128-ba2a-0271067ae426" (UID: "fb785597-d885-4128-ba2a-0271067ae426"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:03.346968 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.346947 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/fb785597-d885-4128-ba2a-0271067ae426-kube-api-access-pzp44" (OuterVolumeSpecName: "kube-api-access-pzp44") pod "fb785597-d885-4128-ba2a-0271067ae426" (UID: "fb785597-d885-4128-ba2a-0271067ae426"). InnerVolumeSpecName "kube-api-access-pzp44". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:48:03.347052 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.346960 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/fb785597-d885-4128-ba2a-0271067ae426-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "fb785597-d885-4128-ba2a-0271067ae426" (UID: "fb785597-d885-4128-ba2a-0271067ae426"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:48:03.445950 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.445913 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/fb785597-d885-4128-ba2a-0271067ae426-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:03.445950 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.445948 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-pzp44\" (UniqueName: \"kubernetes.io/projected/fb785597-d885-4128-ba2a-0271067ae426-kube-api-access-pzp44\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:03.445950 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.445958 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/fb785597-d885-4128-ba2a-0271067ae426-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 13:48:03.669616 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.669585 2581 generic.go:358] "Generic (PLEG): container finished" podID="fb785597-d885-4128-ba2a-0271067ae426" containerID="e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792" exitCode=0
Apr 23 13:48:03.670106 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.669668 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"
Apr 23 13:48:03.670106 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.669679 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerDied","Data":"e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792"}
Apr 23 13:48:03.670106 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.669726 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r" event={"ID":"fb785597-d885-4128-ba2a-0271067ae426","Type":"ContainerDied","Data":"aad73dfcf8a9c3fbd8d5d30b5ca508ee0c6538463decb1dcf6630275105a2607"}
Apr 23 13:48:03.670106 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.669747 2581 scope.go:117] "RemoveContainer" containerID="408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141"
Apr 23 13:48:03.678786 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.678765 2581 scope.go:117] "RemoveContainer" containerID="e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792"
Apr 23 13:48:03.686412 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.686393 2581 scope.go:117] "RemoveContainer" containerID="fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a"
Apr 23 13:48:03.690636 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.690614 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"]
Apr 23 13:48:03.694879 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.694856 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-router-scheduler-76f64dcflx66r"]
Apr 23 13:48:03.698365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.698346 2581 scope.go:117] "RemoveContainer" containerID="408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141"
Apr 23 13:48:03.698673 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:48:03.698652 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141\": container with ID starting with 408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141 not found: ID does not exist" containerID="408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141"
Apr 23 13:48:03.698760 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.698682 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141"} err="failed to get container status \"408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141\": rpc error: code = NotFound desc = could not find container \"408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141\": container with ID starting with 408c47efe13e94b3818803267944ca37d133945718dc1058a810bd6202ebe141 not found: ID does not exist"
Apr 23 13:48:03.698760 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.698702 2581 scope.go:117] "RemoveContainer" containerID="e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792"
Apr 23 13:48:03.698953 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:48:03.698939 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792\": container with ID starting with e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792 not found: ID does not exist" containerID="e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792"
Apr 23 13:48:03.698992 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.698957 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792"} err="failed to get container status \"e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792\": rpc error: code = NotFound desc = could not find container \"e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792\": container with ID starting with e8dc6679291c71d01fe4717ebb2c1b542a783f5e0b9106a196807a771517b792 not found: ID does not exist"
Apr 23 13:48:03.698992 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.698970 2581 scope.go:117] "RemoveContainer" containerID="fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a"
Apr 23 13:48:03.699218 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:48:03.699192 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a\": container with ID starting with fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a not found: ID does not exist" containerID="fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a"
Apr 23 13:48:03.699268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:03.699223 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a"} err="failed to get container status \"fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a\": rpc error: code = NotFound desc = could not find container \"fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a\": container with ID starting with fa0cf73caeeb4690a3058bf9f1a9d3d95c0c6d602dbd8acb1a04a821d746433a not found: ID does not exist"
Apr 23 13:48:04.189576 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:04.189538 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c2843431-4c69-4a14-8870-2dde178a95c0" path="/var/lib/kubelet/pods/c2843431-4c69-4a14-8870-2dde178a95c0/volumes"
Apr 23 13:48:04.190076 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:04.190056 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="fb785597-d885-4128-ba2a-0271067ae426" path="/var/lib/kubelet/pods/fb785597-d885-4128-ba2a-0271067ae426/volumes"
Apr 23 13:48:38.015509 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015468 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"]
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015840 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="storage-initializer"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015851 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="storage-initializer"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015871 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015877 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015885 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="tokenizer"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015892 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="tokenizer"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015901 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2843431-4c69-4a14-8870-2dde178a95c0" containerName="main"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015906 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2843431-4c69-4a14-8870-2dde178a95c0" containerName="main"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015915 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015921 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015927 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c2843431-4c69-4a14-8870-2dde178a95c0" containerName="storage-initializer"
Apr 23 13:48:38.015936 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.015934 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c2843431-4c69-4a14-8870-2dde178a95c0" containerName="storage-initializer"
Apr 23 13:48:38.016442 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.016052 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="tokenizer"
Apr 23 13:48:38.016442 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.016067 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="c2843431-4c69-4a14-8870-2dde178a95c0" containerName="main"
Apr 23 13:48:38.016442 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.016078 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main"
Apr 23 13:48:38.016442 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.016087 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="fb785597-d885-4128-ba2a-0271067ae426" containerName="main"
Apr 23 13:48:38.019964 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.019932 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.022496 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.022468 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-gh6xr\""
Apr 23 13:48:38.022656 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.022523 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 23 13:48:38.032191 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.032127 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"]
Apr 23 13:48:38.167278 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.167237 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.167482 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.167309 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.167482 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.167331 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.167482 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.167349 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z9j7m\" (UniqueName: \"kubernetes.io/projected/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kube-api-access-z9j7m\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.167482 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.167374 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.167482 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.167400 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268406 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268305 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268406 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268348 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268406 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268367 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z9j7m\" (UniqueName: \"kubernetes.io/projected/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kube-api-access-z9j7m\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268406 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268392 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268743 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268421 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268743 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268465 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268850 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268786 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268850 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268834 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"
Apr 23 13:48:38.268935 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268856 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:38.268935 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.268841 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:38.271061 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.271041 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:38.276847 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.276823 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z9j7m\" (UniqueName: \"kubernetes.io/projected/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kube-api-access-z9j7m\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:38.331966 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.331922 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:38.470970 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.470927 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"] Apr 23 13:48:38.473259 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:48:38.473226 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc77663b8_c3f4_4fdc_8e08_f7c3f0a4b457.slice/crio-591c8ff11afd54fcb468861f4dffe1ae806ae24ea447ff3da7b8df3ffb01e64a WatchSource:0}: Error finding container 591c8ff11afd54fcb468861f4dffe1ae806ae24ea447ff3da7b8df3ffb01e64a: Status 404 returned error can't find the container with id 591c8ff11afd54fcb468861f4dffe1ae806ae24ea447ff3da7b8df3ffb01e64a Apr 23 13:48:38.796259 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.796102 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" event={"ID":"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457","Type":"ContainerStarted","Data":"6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a"} Apr 23 13:48:38.796259 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:38.796139 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" event={"ID":"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457","Type":"ContainerStarted","Data":"591c8ff11afd54fcb468861f4dffe1ae806ae24ea447ff3da7b8df3ffb01e64a"} Apr 23 13:48:39.803773 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:39.803732 2581 generic.go:358] "Generic (PLEG): container finished" podID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerID="6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a" exitCode=0 Apr 23 13:48:39.804183 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:39.803819 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" event={"ID":"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457","Type":"ContainerDied","Data":"6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a"} Apr 23 13:48:40.809500 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:40.809460 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" event={"ID":"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457","Type":"ContainerStarted","Data":"e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13"} Apr 23 13:48:40.809980 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:40.809507 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" event={"ID":"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457","Type":"ContainerStarted","Data":"84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545"} Apr 23 13:48:40.809980 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:40.809640 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:40.831576 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:40.831516 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" podStartSLOduration=3.831499496 podStartE2EDuration="3.831499496s" podCreationTimestamp="2026-04-23 13:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:48:40.828821797 +0000 UTC m=+1063.100751159" watchObservedRunningTime="2026-04-23 13:48:40.831499496 +0000 UTC m=+1063.103428832" Apr 23 13:48:47.311752 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:47.311714 2581 kubelet.go:2553] 
"SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"] Apr 23 13:48:47.312929 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:47.312892 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="main" containerID="cri-o://4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b" gracePeriod=30 Apr 23 13:48:47.313488 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:47.313139 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="tokenizer" containerID="cri-o://e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8" gracePeriod=30 Apr 23 13:48:47.837764 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:47.837731 2581 generic.go:358] "Generic (PLEG): container finished" podID="b1ee408a-f236-4aac-8261-a698bead5f68" containerID="4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b" exitCode=0 Apr 23 13:48:47.837948 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:47.837801 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" event={"ID":"b1ee408a-f236-4aac-8261-a698bead5f68","Type":"ContainerDied","Data":"4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b"} Apr 23 13:48:48.332142 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.332100 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:48.332643 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.332186 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:48.334990 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.334962 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:48.663309 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.663286 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:48:48.762676 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.762637 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-tmp\") pod \"b1ee408a-f236-4aac-8261-a698bead5f68\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " Apr 23 13:48:48.762676 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.762692 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-cache\") pod \"b1ee408a-f236-4aac-8261-a698bead5f68\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " Apr 23 13:48:48.762917 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.762717 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1ee408a-f236-4aac-8261-a698bead5f68-tls-certs\") pod \"b1ee408a-f236-4aac-8261-a698bead5f68\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " Apr 23 13:48:48.762917 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.762737 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-kserve-provision-location\") pod 
\"b1ee408a-f236-4aac-8261-a698bead5f68\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " Apr 23 13:48:48.762917 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.762756 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xcnms\" (UniqueName: \"kubernetes.io/projected/b1ee408a-f236-4aac-8261-a698bead5f68-kube-api-access-xcnms\") pod \"b1ee408a-f236-4aac-8261-a698bead5f68\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " Apr 23 13:48:48.762917 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.762826 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-uds\") pod \"b1ee408a-f236-4aac-8261-a698bead5f68\" (UID: \"b1ee408a-f236-4aac-8261-a698bead5f68\") " Apr 23 13:48:48.763120 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.763054 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "b1ee408a-f236-4aac-8261-a698bead5f68" (UID: "b1ee408a-f236-4aac-8261-a698bead5f68"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:48.763120 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.763069 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "b1ee408a-f236-4aac-8261-a698bead5f68" (UID: "b1ee408a-f236-4aac-8261-a698bead5f68"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:48.763243 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.763113 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "b1ee408a-f236-4aac-8261-a698bead5f68" (UID: "b1ee408a-f236-4aac-8261-a698bead5f68"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:48.763490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.763468 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b1ee408a-f236-4aac-8261-a698bead5f68" (UID: "b1ee408a-f236-4aac-8261-a698bead5f68"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:48:48.765040 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.765015 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1ee408a-f236-4aac-8261-a698bead5f68-kube-api-access-xcnms" (OuterVolumeSpecName: "kube-api-access-xcnms") pod "b1ee408a-f236-4aac-8261-a698bead5f68" (UID: "b1ee408a-f236-4aac-8261-a698bead5f68"). InnerVolumeSpecName "kube-api-access-xcnms". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:48:48.765110 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.765095 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1ee408a-f236-4aac-8261-a698bead5f68-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b1ee408a-f236-4aac-8261-a698bead5f68" (UID: "b1ee408a-f236-4aac-8261-a698bead5f68"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:48:48.843191 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.843130 2581 generic.go:358] "Generic (PLEG): container finished" podID="b1ee408a-f236-4aac-8261-a698bead5f68" containerID="e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8" exitCode=0 Apr 23 13:48:48.843365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.843222 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" event={"ID":"b1ee408a-f236-4aac-8261-a698bead5f68","Type":"ContainerDied","Data":"e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8"} Apr 23 13:48:48.843365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.843248 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" Apr 23 13:48:48.843365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.843273 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t" event={"ID":"b1ee408a-f236-4aac-8261-a698bead5f68","Type":"ContainerDied","Data":"63782d6f0c9138cf0d2846db483a107c6f0e248bf3389709dd7dfa675eadd69a"} Apr 23 13:48:48.843365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.843294 2581 scope.go:117] "RemoveContainer" containerID="e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8" Apr 23 13:48:48.844785 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.844759 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:48:48.852724 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.852706 2581 scope.go:117] "RemoveContainer" containerID="4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b" Apr 23 13:48:48.861247 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:48:48.861222 2581 scope.go:117] "RemoveContainer" containerID="0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9" Apr 23 13:48:48.863470 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.863449 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:48:48.863470 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.863470 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:48:48.863609 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.863479 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:48:48.863609 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.863487 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b1ee408a-f236-4aac-8261-a698bead5f68-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:48:48.863609 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.863500 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b1ee408a-f236-4aac-8261-a698bead5f68-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:48:48.863609 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.863508 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xcnms\" (UniqueName: \"kubernetes.io/projected/b1ee408a-f236-4aac-8261-a698bead5f68-kube-api-access-xcnms\") on node 
\"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:48:48.869763 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.869744 2581 scope.go:117] "RemoveContainer" containerID="e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8" Apr 23 13:48:48.870041 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:48:48.870022 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8\": container with ID starting with e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8 not found: ID does not exist" containerID="e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8" Apr 23 13:48:48.870089 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.870055 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8"} err="failed to get container status \"e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8\": rpc error: code = NotFound desc = could not find container \"e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8\": container with ID starting with e9c0b3ef29465ae42d3473f6bedcde5747077328a506918d5040f26762c4f7f8 not found: ID does not exist" Apr 23 13:48:48.870089 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.870074 2581 scope.go:117] "RemoveContainer" containerID="4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b" Apr 23 13:48:48.870327 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:48:48.870300 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b\": container with ID starting with 4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b not found: ID does not exist" 
containerID="4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b" Apr 23 13:48:48.870376 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.870324 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b"} err="failed to get container status \"4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b\": rpc error: code = NotFound desc = could not find container \"4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b\": container with ID starting with 4f6c6ef5bf9886a5c7222a602f07e7f00fba83eaa14405983af49ac78f3a8a9b not found: ID does not exist" Apr 23 13:48:48.870376 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.870338 2581 scope.go:117] "RemoveContainer" containerID="0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9" Apr 23 13:48:48.870538 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:48:48.870519 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9\": container with ID starting with 0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9 not found: ID does not exist" containerID="0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9" Apr 23 13:48:48.870597 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.870548 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9"} err="failed to get container status \"0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9\": rpc error: code = NotFound desc = could not find container \"0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9\": container with ID starting with 0c52865ffae7b5efced55d0e2a10a454eefc26a2ac1936066099ed7a251010f9 not found: ID does not exist" Apr 23 
13:48:48.883042 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.883011 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"] Apr 23 13:48:48.889770 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:48.889741 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schegjl8t"] Apr 23 13:48:50.188538 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:50.188505 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" path="/var/lib/kubelet/pods/b1ee408a-f236-4aac-8261-a698bead5f68/volumes" Apr 23 13:48:59.815437 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.815401 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs"] Apr 23 13:48:59.815988 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.815972 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="main" Apr 23 13:48:59.816036 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.815992 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="main" Apr 23 13:48:59.816036 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.816008 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="storage-initializer" Apr 23 13:48:59.816036 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.816016 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="storage-initializer" Apr 23 13:48:59.816142 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.816045 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="tokenizer" Apr 23 13:48:59.816142 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.816053 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="tokenizer" Apr 23 13:48:59.816254 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.816145 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="tokenizer" Apr 23 13:48:59.816254 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.816189 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b1ee408a-f236-4aac-8261-a698bead5f68" containerName="main" Apr 23 13:48:59.822004 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.821971 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.824691 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.824670 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-dvz9n\"" Apr 23 13:48:59.824826 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.824715 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\"" Apr 23 13:48:59.828342 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.828314 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs"] Apr 23 13:48:59.865553 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.865517 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nhskl\" (UniqueName: \"kubernetes.io/projected/154ce61e-d265-42bf-8576-86a0a1f0e7af-kube-api-access-nhskl\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.865719 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.865601 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.865719 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.865650 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.865719 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.865676 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.865719 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.865701 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/154ce61e-d265-42bf-8576-86a0a1f0e7af-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.865860 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.865725 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967073 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967029 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/154ce61e-d265-42bf-8576-86a0a1f0e7af-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967073 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967073 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967339 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967133 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nhskl\" (UniqueName: 
\"kubernetes.io/projected/154ce61e-d265-42bf-8576-86a0a1f0e7af-kube-api-access-nhskl\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967339 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967196 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967339 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967222 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967339 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967252 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967608 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967576 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967727 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967638 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967727 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967586 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.967727 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.967687 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.969904 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.969882 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/154ce61e-d265-42bf-8576-86a0a1f0e7af-tls-certs\") pod 
\"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:48:59.977102 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:48:59.977076 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nhskl\" (UniqueName: \"kubernetes.io/projected/154ce61e-d265-42bf-8576-86a0a1f0e7af-kube-api-access-nhskl\") pod \"custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:49:00.133358 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:00.133316 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:49:00.265822 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:00.265792 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs"] Apr 23 13:49:00.267511 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:49:00.267482 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod154ce61e_d265_42bf_8576_86a0a1f0e7af.slice/crio-cfca8d55d8bc9ff5b553f8966d8e9db462f9b88acaf32f3da5b25c681deac654 WatchSource:0}: Error finding container cfca8d55d8bc9ff5b553f8966d8e9db462f9b88acaf32f3da5b25c681deac654: Status 404 returned error can't find the container with id cfca8d55d8bc9ff5b553f8966d8e9db462f9b88acaf32f3da5b25c681deac654 Apr 23 13:49:00.895029 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:00.894993 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" 
event={"ID":"154ce61e-d265-42bf-8576-86a0a1f0e7af","Type":"ContainerStarted","Data":"6d70d050a5d5a09c6e2deaf4154b21fcefc5b4fd91d948483a3f23d9cbee9398"} Apr 23 13:49:00.895029 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:00.895030 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" event={"ID":"154ce61e-d265-42bf-8576-86a0a1f0e7af","Type":"ContainerStarted","Data":"cfca8d55d8bc9ff5b553f8966d8e9db462f9b88acaf32f3da5b25c681deac654"} Apr 23 13:49:01.900490 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:01.900452 2581 generic.go:358] "Generic (PLEG): container finished" podID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerID="6d70d050a5d5a09c6e2deaf4154b21fcefc5b4fd91d948483a3f23d9cbee9398" exitCode=0 Apr 23 13:49:01.900897 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:01.900544 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" event={"ID":"154ce61e-d265-42bf-8576-86a0a1f0e7af","Type":"ContainerDied","Data":"6d70d050a5d5a09c6e2deaf4154b21fcefc5b4fd91d948483a3f23d9cbee9398"} Apr 23 13:49:02.906333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:02.906300 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" event={"ID":"154ce61e-d265-42bf-8576-86a0a1f0e7af","Type":"ContainerStarted","Data":"6810c0eba6bf7d59c95c589e4767696025c344f8a80b04fdcc1676ab7edda841"} Apr 23 13:49:02.906333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:02.906336 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" event={"ID":"154ce61e-d265-42bf-8576-86a0a1f0e7af","Type":"ContainerStarted","Data":"f899d72dc1295af5cca02f890304c1905d0769a6c2a50eeec387691e9c45e37d"} Apr 23 13:49:02.906778 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:49:02.906415 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:49:02.927351 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:02.927294 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" podStartSLOduration=3.927274612 podStartE2EDuration="3.927274612s" podCreationTimestamp="2026-04-23 13:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:49:02.926358633 +0000 UTC m=+1085.198287966" watchObservedRunningTime="2026-04-23 13:49:02.927274612 +0000 UTC m=+1085.199203950" Apr 23 13:49:09.849877 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:09.849847 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:49:10.134195 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:10.134136 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:49:10.134195 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:10.134196 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:49:10.136832 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:10.136810 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:49:10.939793 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:10.939765 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:49:31.943450 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:49:31.943419 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:50:57.844550 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:57.844517 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"] Apr 23 13:50:57.845046 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:57.844818 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="main" containerID="cri-o://84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545" gracePeriod=30 Apr 23 13:50:57.845046 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:57.844859 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="tokenizer" containerID="cri-o://e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13" gracePeriod=30 Apr 23 13:50:58.237003 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:58.236974 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 13:50:58.241332 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:58.241302 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 13:50:58.326457 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:58.326417 2581 generic.go:358] "Generic (PLEG): 
container finished" podID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerID="84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545" exitCode=0 Apr 23 13:50:58.326630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:58.326487 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" event={"ID":"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457","Type":"ContainerDied","Data":"84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545"} Apr 23 13:50:58.844473 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:58.844423 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.39:8082/healthz\": dial tcp 10.132.0.39:8082: connect: connection refused" Apr 23 13:50:59.189921 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.189891 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:50:59.257117 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257087 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kserve-provision-location\") pod \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " Apr 23 13:50:59.257311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257135 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-tmp\") pod \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " Apr 23 13:50:59.257311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257180 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z9j7m\" (UniqueName: \"kubernetes.io/projected/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kube-api-access-z9j7m\") pod \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " Apr 23 13:50:59.257311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257227 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-cache\") pod \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " Apr 23 13:50:59.257311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257267 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tls-certs\") pod \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " Apr 23 
13:50:59.257311 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257304 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-uds\") pod \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\" (UID: \"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457\") " Apr 23 13:50:59.257565 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257491 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" (UID: "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:50:59.257565 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257517 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" (UID: "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:50:59.257669 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257590 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:50:59.257669 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257637 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:50:59.257760 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257653 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" (UID: "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:50:59.257911 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.257889 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" (UID: "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:50:59.259519 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.259497 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" (UID: "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:50:59.259612 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.259576 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kube-api-access-z9j7m" (OuterVolumeSpecName: "kube-api-access-z9j7m") pod "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" (UID: "c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457"). InnerVolumeSpecName "kube-api-access-z9j7m". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:50:59.331904 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.331869 2581 generic.go:358] "Generic (PLEG): container finished" podID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerID="e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13" exitCode=0 Apr 23 13:50:59.332100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.331945 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" Apr 23 13:50:59.332100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.331956 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" event={"ID":"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457","Type":"ContainerDied","Data":"e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13"} Apr 23 13:50:59.332100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.331994 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42" event={"ID":"c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457","Type":"ContainerDied","Data":"591c8ff11afd54fcb468861f4dffe1ae806ae24ea447ff3da7b8df3ffb01e64a"} Apr 23 13:50:59.332100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.332021 2581 scope.go:117] "RemoveContainer" 
containerID="e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13" Apr 23 13:50:59.340940 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.340914 2581 scope.go:117] "RemoveContainer" containerID="84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545" Apr 23 13:50:59.348605 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.348588 2581 scope.go:117] "RemoveContainer" containerID="6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a" Apr 23 13:50:59.354459 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.354436 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"] Apr 23 13:50:59.357095 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.357068 2581 scope.go:117] "RemoveContainer" containerID="e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13" Apr 23 13:50:59.357487 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:50:59.357464 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13\": container with ID starting with e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13 not found: ID does not exist" containerID="e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13" Apr 23 13:50:59.357606 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.357498 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13"} err="failed to get container status \"e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13\": rpc error: code = NotFound desc = could not find container \"e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13\": container with ID starting with e989702cbd1888521b7fc18cc61c6b0450f831c6bedfa4fc318f6a97c53fdc13 not found: ID does not exist" Apr 23 
13:50:59.357606 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.357522 2581 scope.go:117] "RemoveContainer" containerID="84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545" Apr 23 13:50:59.357829 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:50:59.357806 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545\": container with ID starting with 84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545 not found: ID does not exist" containerID="84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545" Apr 23 13:50:59.357918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.357837 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545"} err="failed to get container status \"84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545\": rpc error: code = NotFound desc = could not find container \"84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545\": container with ID starting with 84968fcc18f9b99ea639407f382a8ea807eb8a4eb919b31236f2637f57cae545 not found: ID does not exist" Apr 23 13:50:59.357918 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.357857 2581 scope.go:117] "RemoveContainer" containerID="6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a" Apr 23 13:50:59.358134 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:50:59.358108 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a\": container with ID starting with 6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a not found: ID does not exist" containerID="6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a" Apr 23 13:50:59.358211 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.358140 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:50:59.358211 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.358145 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a"} err="failed to get container status \"6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a\": rpc error: code = NotFound desc = could not find container \"6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a\": container with ID starting with 6f11cd576cd678645b78e345d8dacfe7a3917a499bf695437c88a82ac0f6e63a not found: ID does not exist" Apr 23 13:50:59.358211 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.358185 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:50:59.358211 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.358196 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:50:59.358211 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.358205 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z9j7m\" (UniqueName: \"kubernetes.io/projected/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457-kube-api-access-z9j7m\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:50:59.358397 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:50:59.358301 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" 
pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-ncb42"] Apr 23 13:51:00.101692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:00.101658 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs"] Apr 23 13:51:00.102015 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:00.101984 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="main" containerID="cri-o://f899d72dc1295af5cca02f890304c1905d0769a6c2a50eeec387691e9c45e37d" gracePeriod=30 Apr 23 13:51:00.102207 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:00.102027 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="tokenizer" containerID="cri-o://6810c0eba6bf7d59c95c589e4767696025c344f8a80b04fdcc1676ab7edda841" gracePeriod=30 Apr 23 13:51:00.188430 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:00.188393 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" path="/var/lib/kubelet/pods/c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457/volumes" Apr 23 13:51:00.338862 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:00.338823 2581 generic.go:358] "Generic (PLEG): container finished" podID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerID="f899d72dc1295af5cca02f890304c1905d0769a6c2a50eeec387691e9c45e37d" exitCode=0 Apr 23 13:51:00.339397 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:00.338876 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" 
event={"ID":"154ce61e-d265-42bf-8576-86a0a1f0e7af","Type":"ContainerDied","Data":"f899d72dc1295af5cca02f890304c1905d0769a6c2a50eeec387691e9c45e37d"} Apr 23 13:51:00.939093 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:00.939036 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.40:8082/healthz\": dial tcp 10.132.0.40:8082: connect: connection refused" Apr 23 13:51:01.345120 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.345074 2581 generic.go:358] "Generic (PLEG): container finished" podID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerID="6810c0eba6bf7d59c95c589e4767696025c344f8a80b04fdcc1676ab7edda841" exitCode=0 Apr 23 13:51:01.345614 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.345144 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" event={"ID":"154ce61e-d265-42bf-8576-86a0a1f0e7af","Type":"ContainerDied","Data":"6810c0eba6bf7d59c95c589e4767696025c344f8a80b04fdcc1676ab7edda841"} Apr 23 13:51:01.462581 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.462553 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:51:01.578081 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.577969 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-tmp\") pod \"154ce61e-d265-42bf-8576-86a0a1f0e7af\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " Apr 23 13:51:01.578081 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578044 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-cache\") pod \"154ce61e-d265-42bf-8576-86a0a1f0e7af\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " Apr 23 13:51:01.578317 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578119 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-kserve-provision-location\") pod \"154ce61e-d265-42bf-8576-86a0a1f0e7af\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " Apr 23 13:51:01.578317 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578228 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-uds\") pod \"154ce61e-d265-42bf-8576-86a0a1f0e7af\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " Apr 23 13:51:01.578317 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578291 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/154ce61e-d265-42bf-8576-86a0a1f0e7af-tls-certs\") pod \"154ce61e-d265-42bf-8576-86a0a1f0e7af\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " Apr 23 
13:51:01.578497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578347 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nhskl\" (UniqueName: \"kubernetes.io/projected/154ce61e-d265-42bf-8576-86a0a1f0e7af-kube-api-access-nhskl\") pod \"154ce61e-d265-42bf-8576-86a0a1f0e7af\" (UID: \"154ce61e-d265-42bf-8576-86a0a1f0e7af\") " Apr 23 13:51:01.578497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578418 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "154ce61e-d265-42bf-8576-86a0a1f0e7af" (UID: "154ce61e-d265-42bf-8576-86a0a1f0e7af"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:01.578497 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578448 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "154ce61e-d265-42bf-8576-86a0a1f0e7af" (UID: "154ce61e-d265-42bf-8576-86a0a1f0e7af"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:01.578644 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578598 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:51:01.578644 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578615 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:51:01.578768 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.578730 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "154ce61e-d265-42bf-8576-86a0a1f0e7af" (UID: "154ce61e-d265-42bf-8576-86a0a1f0e7af"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:01.579142 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.579115 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "154ce61e-d265-42bf-8576-86a0a1f0e7af" (UID: "154ce61e-d265-42bf-8576-86a0a1f0e7af"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:01.580740 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.580710 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/154ce61e-d265-42bf-8576-86a0a1f0e7af-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "154ce61e-d265-42bf-8576-86a0a1f0e7af" (UID: "154ce61e-d265-42bf-8576-86a0a1f0e7af"). 
InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:51:01.580834 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.580742 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/154ce61e-d265-42bf-8576-86a0a1f0e7af-kube-api-access-nhskl" (OuterVolumeSpecName: "kube-api-access-nhskl") pod "154ce61e-d265-42bf-8576-86a0a1f0e7af" (UID: "154ce61e-d265-42bf-8576-86a0a1f0e7af"). InnerVolumeSpecName "kube-api-access-nhskl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:51:01.679555 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.679518 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nhskl\" (UniqueName: \"kubernetes.io/projected/154ce61e-d265-42bf-8576-86a0a1f0e7af-kube-api-access-nhskl\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:51:01.679555 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.679549 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:51:01.679555 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.679562 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/154ce61e-d265-42bf-8576-86a0a1f0e7af-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:51:01.679798 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:01.679573 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/154ce61e-d265-42bf-8576-86a0a1f0e7af-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:51:02.350614 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:02.350576 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" event={"ID":"154ce61e-d265-42bf-8576-86a0a1f0e7af","Type":"ContainerDied","Data":"cfca8d55d8bc9ff5b553f8966d8e9db462f9b88acaf32f3da5b25c681deac654"} Apr 23 13:51:02.350614 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:02.350622 2581 scope.go:117] "RemoveContainer" containerID="6810c0eba6bf7d59c95c589e4767696025c344f8a80b04fdcc1676ab7edda841" Apr 23 13:51:02.351085 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:02.350630 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs" Apr 23 13:51:02.358985 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:02.358958 2581 scope.go:117] "RemoveContainer" containerID="f899d72dc1295af5cca02f890304c1905d0769a6c2a50eeec387691e9c45e37d" Apr 23 13:51:02.366613 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:02.366582 2581 scope.go:117] "RemoveContainer" containerID="6d70d050a5d5a09c6e2deaf4154b21fcefc5b4fd91d948483a3f23d9cbee9398" Apr 23 13:51:02.370381 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:02.370358 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs"] Apr 23 13:51:02.377033 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:02.377004 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-6dc456c46hngs"] Apr 23 13:51:04.188920 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:04.188881 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" path="/var/lib/kubelet/pods/154ce61e-d265-42bf-8576-86a0a1f0e7af/volumes" Apr 23 13:51:08.469299 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469244 2581 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f"] Apr 23 13:51:08.469819 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469801 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="tokenizer" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469821 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="tokenizer" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469834 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="storage-initializer" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469840 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="storage-initializer" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469848 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="tokenizer" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469854 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="tokenizer" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469867 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="main" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469872 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="main" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469885 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="main" Apr 23 13:51:08.469885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469890 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="main" Apr 23 13:51:08.470393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469896 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="storage-initializer" Apr 23 13:51:08.470393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469901 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="storage-initializer" Apr 23 13:51:08.470393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469957 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="tokenizer" Apr 23 13:51:08.470393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469964 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="tokenizer" Apr 23 13:51:08.470393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469973 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="154ce61e-d265-42bf-8576-86a0a1f0e7af" containerName="main" Apr 23 13:51:08.470393 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.469980 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="c77663b8-c3f4-4fdc-8e08-f7c3f0a4b457" containerName="main" Apr 23 13:51:08.475079 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.475059 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.477819 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.477796 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:51:08.479053 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.479029 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 23 13:51:08.479053 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.479045 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\"" Apr 23 13:51:08.479267 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.479046 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:51:08.479267 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.479046 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-vvfp4\"" Apr 23 13:51:08.485402 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.485380 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f"] Apr 23 13:51:08.540987 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.540953 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.541199 ip-10-0-133-33 kubenswrapper[2581]: 
I0423 13:51:08.540995 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.541199 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.541072 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.541199 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.541125 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ks2rq\" (UniqueName: \"kubernetes.io/projected/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kube-api-access-ks2rq\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.541375 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.541213 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 
13:51:08.541375 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.541263 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.642621 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.642581 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.642824 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.642634 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.642824 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.642670 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.642824 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:51:08.642687 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.642824 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.642717 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.642824 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.642745 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ks2rq\" (UniqueName: \"kubernetes.io/projected/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kube-api-access-ks2rq\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.643092 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.643051 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.643174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.643103 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.643174 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.643133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.643252 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.643191 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.645355 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.645335 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.650775 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.650750 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-ks2rq\" (UniqueName: \"kubernetes.io/projected/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kube-api-access-ks2rq\") pod \"router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.784763 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.784668 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:08.917922 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.917897 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f"] Apr 23 13:51:08.920336 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:51:08.920303 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod7bb736cc_48a1_4dd9_ac95_f7466ad3ecfe.slice/crio-45132eae2cf5fce76465f6fe6641d98c8b2f629c50ca062b6d1dadd1c58ec542 WatchSource:0}: Error finding container 45132eae2cf5fce76465f6fe6641d98c8b2f629c50ca062b6d1dadd1c58ec542: Status 404 returned error can't find the container with id 45132eae2cf5fce76465f6fe6641d98c8b2f629c50ca062b6d1dadd1c58ec542 Apr 23 13:51:08.922216 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:08.922197 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:51:09.381129 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:09.381091 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" event={"ID":"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe","Type":"ContainerStarted","Data":"fd5a284e23ed05c9dbeeba7bde9560c014540fa175e2c7ad8d8b173a0a6f10cf"} Apr 23 13:51:09.381129 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:51:09.381133 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" event={"ID":"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe","Type":"ContainerStarted","Data":"45132eae2cf5fce76465f6fe6641d98c8b2f629c50ca062b6d1dadd1c58ec542"} Apr 23 13:51:10.385925 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:10.385891 2581 generic.go:358] "Generic (PLEG): container finished" podID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerID="fd5a284e23ed05c9dbeeba7bde9560c014540fa175e2c7ad8d8b173a0a6f10cf" exitCode=0 Apr 23 13:51:10.386369 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:10.385996 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" event={"ID":"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe","Type":"ContainerDied","Data":"fd5a284e23ed05c9dbeeba7bde9560c014540fa175e2c7ad8d8b173a0a6f10cf"} Apr 23 13:51:11.391758 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:11.391718 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" event={"ID":"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe","Type":"ContainerStarted","Data":"f39523cc328198270c7774dfa091eb5420f5c48ff398605dfb82e5a6937e03e9"} Apr 23 13:51:11.392189 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:11.391767 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" event={"ID":"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe","Type":"ContainerStarted","Data":"a101b74ad05a0d597162ba369e0f3d3d309ab5a90f113b9a397671c1c3fc75c1"} Apr 23 13:51:11.392189 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:11.391837 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:11.420986 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:51:11.420920 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" podStartSLOduration=3.420904294 podStartE2EDuration="3.420904294s" podCreationTimestamp="2026-04-23 13:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:51:11.41823876 +0000 UTC m=+1213.690168095" watchObservedRunningTime="2026-04-23 13:51:11.420904294 +0000 UTC m=+1213.692833627" Apr 23 13:51:18.785541 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:18.785484 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:18.785541 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:18.785555 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:18.788463 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:18.788437 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:19.422050 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:19.422020 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:23.559007 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.558974 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s"] Apr 23 13:51:23.563348 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.563326 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.566075 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.566051 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-xrbzn\"" Apr 23 13:51:23.566814 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.566796 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 23 13:51:23.573835 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.573814 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s"] Apr 23 13:51:23.689308 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.689270 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.689308 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.689309 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.689511 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.689347 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xrn8v\" (UniqueName: 
\"kubernetes.io/projected/8468f591-62c9-4d0b-85da-66f8dc5ff968-kube-api-access-xrn8v\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.689511 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.689384 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.689511 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.689426 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8468f591-62c9-4d0b-85da-66f8dc5ff968-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.689511 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.689446 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.790729 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.790693 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.790729 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.790725 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.790968 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.790752 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xrn8v\" (UniqueName: \"kubernetes.io/projected/8468f591-62c9-4d0b-85da-66f8dc5ff968-kube-api-access-xrn8v\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.790968 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.790869 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.790968 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.790956 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/8468f591-62c9-4d0b-85da-66f8dc5ff968-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.791085 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.790997 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.791131 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.791120 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.791201 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.791185 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.791273 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.791254 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-kserve-provision-location\") pod 
\"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.791338 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.791323 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.793575 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.793554 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8468f591-62c9-4d0b-85da-66f8dc5ff968-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.799951 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.799928 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xrn8v\" (UniqueName: \"kubernetes.io/projected/8468f591-62c9-4d0b-85da-66f8dc5ff968-kube-api-access-xrn8v\") pod \"stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:23.873594 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:23.873546 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:24.009221 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:24.009194 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s"] Apr 23 13:51:24.011129 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:51:24.011092 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8468f591_62c9_4d0b_85da_66f8dc5ff968.slice/crio-60691a494f700d54226a5ee2dd224da90e734ed2f432916aa540f9a69d3e5115 WatchSource:0}: Error finding container 60691a494f700d54226a5ee2dd224da90e734ed2f432916aa540f9a69d3e5115: Status 404 returned error can't find the container with id 60691a494f700d54226a5ee2dd224da90e734ed2f432916aa540f9a69d3e5115 Apr 23 13:51:24.439692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:24.439653 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" event={"ID":"8468f591-62c9-4d0b-85da-66f8dc5ff968","Type":"ContainerStarted","Data":"9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee"} Apr 23 13:51:24.439692 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:24.439695 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" event={"ID":"8468f591-62c9-4d0b-85da-66f8dc5ff968","Type":"ContainerStarted","Data":"60691a494f700d54226a5ee2dd224da90e734ed2f432916aa540f9a69d3e5115"} Apr 23 13:51:25.444608 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:25.444573 2581 generic.go:358] "Generic (PLEG): container finished" podID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerID="9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee" exitCode=0 Apr 23 13:51:25.445050 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:25.444629 2581 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" event={"ID":"8468f591-62c9-4d0b-85da-66f8dc5ff968","Type":"ContainerDied","Data":"9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee"} Apr 23 13:51:26.450797 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:26.450756 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" event={"ID":"8468f591-62c9-4d0b-85da-66f8dc5ff968","Type":"ContainerStarted","Data":"e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba"} Apr 23 13:51:26.450797 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:26.450798 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" event={"ID":"8468f591-62c9-4d0b-85da-66f8dc5ff968","Type":"ContainerStarted","Data":"82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600"} Apr 23 13:51:26.451418 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:26.450865 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:26.475498 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:26.475446 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" podStartSLOduration=3.475429816 podStartE2EDuration="3.475429816s" podCreationTimestamp="2026-04-23 13:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:51:26.472705598 +0000 UTC m=+1228.744634944" watchObservedRunningTime="2026-04-23 13:51:26.475429816 +0000 UTC m=+1228.747359150" Apr 23 13:51:33.873978 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:33.873936 2581 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:33.874519 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:33.874085 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:33.876894 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:33.876874 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:34.482407 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:34.482369 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:51:40.425598 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:40.425524 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:51:56.490141 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:51:56.490109 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:53:17.522564 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:17.522475 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f"] Apr 23 13:53:17.523118 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:17.522890 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="main" containerID="cri-o://a101b74ad05a0d597162ba369e0f3d3d309ab5a90f113b9a397671c1c3fc75c1" gracePeriod=30 Apr 23 
13:53:17.523118 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:17.522914 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="tokenizer" containerID="cri-o://f39523cc328198270c7774dfa091eb5420f5c48ff398605dfb82e5a6937e03e9" gracePeriod=30 Apr 23 13:53:17.856562 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:17.856532 2581 generic.go:358] "Generic (PLEG): container finished" podID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerID="a101b74ad05a0d597162ba369e0f3d3d309ab5a90f113b9a397671c1c3fc75c1" exitCode=0 Apr 23 13:53:17.856730 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:17.856606 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" event={"ID":"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe","Type":"ContainerDied","Data":"a101b74ad05a0d597162ba369e0f3d3d309ab5a90f113b9a397671c1c3fc75c1"} Apr 23 13:53:18.861569 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.861529 2581 generic.go:358] "Generic (PLEG): container finished" podID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerID="f39523cc328198270c7774dfa091eb5420f5c48ff398605dfb82e5a6937e03e9" exitCode=0 Apr 23 13:53:18.861835 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.861561 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" event={"ID":"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe","Type":"ContainerDied","Data":"f39523cc328198270c7774dfa091eb5420f5c48ff398605dfb82e5a6937e03e9"} Apr 23 13:53:18.861835 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.861600 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" 
event={"ID":"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe","Type":"ContainerDied","Data":"45132eae2cf5fce76465f6fe6641d98c8b2f629c50ca062b6d1dadd1c58ec542"} Apr 23 13:53:18.861835 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.861615 2581 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="45132eae2cf5fce76465f6fe6641d98c8b2f629c50ca062b6d1dadd1c58ec542" Apr 23 13:53:18.875272 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.875248 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:53:18.936553 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936521 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kserve-provision-location\") pod \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " Apr 23 13:53:18.936808 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936569 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-uds\") pod \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " Apr 23 13:53:18.936808 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936590 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-tmp\") pod \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " Apr 23 13:53:18.936808 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936619 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-cache\") pod \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " Apr 23 13:53:18.936808 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936657 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ks2rq\" (UniqueName: \"kubernetes.io/projected/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kube-api-access-ks2rq\") pod \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " Apr 23 13:53:18.936808 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936738 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tls-certs\") pod \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\" (UID: \"7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe\") " Apr 23 13:53:18.937060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936862 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" (UID: "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:18.937060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936914 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" (UID: "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:18.937060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.936982 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" (UID: "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:18.937060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.937005 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:18.937060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.937024 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:18.937349 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.937329 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" (UID: "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:18.939063 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.939032 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kube-api-access-ks2rq" (OuterVolumeSpecName: "kube-api-access-ks2rq") pod "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" (UID: "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe"). InnerVolumeSpecName "kube-api-access-ks2rq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:53:18.939203 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:18.939060 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" (UID: "7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:53:19.038499 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:19.038462 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:19.038499 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:19.038495 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:19.038499 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:19.038505 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:19.038755 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:19.038516 2581 
reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ks2rq\" (UniqueName: \"kubernetes.io/projected/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe-kube-api-access-ks2rq\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:19.865117 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:19.865080 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f" Apr 23 13:53:19.888201 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:19.888169 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f"] Apr 23 13:53:19.891012 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:19.890986 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-5d4cfb9d89-5qv5f"] Apr 23 13:53:20.188931 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:20.188855 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" path="/var/lib/kubelet/pods/7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe/volumes" Apr 23 13:53:33.905959 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.905917 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9"] Apr 23 13:53:33.906572 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.906540 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="main" Apr 23 13:53:33.906572 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.906561 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="main" Apr 23 13:53:33.906691 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.906576 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="tokenizer" Apr 23 13:53:33.906691 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.906585 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="tokenizer" Apr 23 13:53:33.906691 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.906619 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="storage-initializer" Apr 23 13:53:33.906691 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.906628 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="storage-initializer" Apr 23 13:53:33.906899 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.906734 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="main" Apr 23 13:53:33.906899 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.906751 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="7bb736cc-48a1-4dd9-ac95-f7466ad3ecfe" containerName="tokenizer" Apr 23 13:53:33.912146 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.912120 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:33.914442 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.914412 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-ltxj6\"" Apr 23 13:53:33.914586 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.914513 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 23 13:53:33.919994 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.919969 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9"] Apr 23 13:53:33.976664 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.976615 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:33.976664 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.976667 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:33.976885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.976718 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-8pp54\" (UniqueName: \"kubernetes.io/projected/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kube-api-access-8pp54\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:33.976885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.976823 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:33.977002 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.976901 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:33.977002 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:33.976926 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.077885 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.077844 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078088 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.077894 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078088 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.077993 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078088 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.078027 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078088 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.078063 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-8pp54\" (UniqueName: \"kubernetes.io/projected/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kube-api-access-8pp54\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078360 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.078132 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078360 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.078345 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078447 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.078386 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078447 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.078411 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.078554 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.078531 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.080676 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.080653 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.086727 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.086706 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8pp54\" (UniqueName: \"kubernetes.io/projected/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kube-api-access-8pp54\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.224329 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.224249 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:34.360513 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.360477 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9"] Apr 23 13:53:34.363693 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:53:34.363665 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5db3d074_e1b7_4d3d_bdae_49e38aad0635.slice/crio-b09840eb8312c193369f0a8a0e733ef9bf708291e2573c1a9dfb5ba92dcdebed WatchSource:0}: Error finding container b09840eb8312c193369f0a8a0e733ef9bf708291e2573c1a9dfb5ba92dcdebed: Status 404 returned error can't find the container with id b09840eb8312c193369f0a8a0e733ef9bf708291e2573c1a9dfb5ba92dcdebed Apr 23 13:53:34.920776 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.920728 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" event={"ID":"5db3d074-e1b7-4d3d-bdae-49e38aad0635","Type":"ContainerStarted","Data":"74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348"} Apr 23 13:53:34.920776 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:34.920775 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" event={"ID":"5db3d074-e1b7-4d3d-bdae-49e38aad0635","Type":"ContainerStarted","Data":"b09840eb8312c193369f0a8a0e733ef9bf708291e2573c1a9dfb5ba92dcdebed"} Apr 23 13:53:35.925928 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:35.925887 2581 generic.go:358] "Generic (PLEG): container finished" podID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerID="74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348" exitCode=0 Apr 23 13:53:35.926348 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:53:35.925949 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" event={"ID":"5db3d074-e1b7-4d3d-bdae-49e38aad0635","Type":"ContainerDied","Data":"74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348"} Apr 23 13:53:36.931781 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:36.931741 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" event={"ID":"5db3d074-e1b7-4d3d-bdae-49e38aad0635","Type":"ContainerStarted","Data":"73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd"} Apr 23 13:53:36.932341 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:36.931789 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" event={"ID":"5db3d074-e1b7-4d3d-bdae-49e38aad0635","Type":"ContainerStarted","Data":"30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4"} Apr 23 13:53:36.932341 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:36.931893 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:36.955762 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:36.955701 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" podStartSLOduration=3.955684701 podStartE2EDuration="3.955684701s" podCreationTimestamp="2026-04-23 13:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:53:36.95351834 +0000 UTC m=+1359.225447675" watchObservedRunningTime="2026-04-23 13:53:36.955684701 +0000 UTC m=+1359.227614035" Apr 23 13:53:44.224450 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:53:44.224408 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:44.225024 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.224515 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:44.227450 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.227422 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:44.241403 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.241375 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s"] Apr 23 13:53:44.241723 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.241701 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="main" containerID="cri-o://82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600" gracePeriod=30 Apr 23 13:53:44.241775 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.241722 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="tokenizer" containerID="cri-o://e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba" gracePeriod=30 Apr 23 13:53:44.482231 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.482123 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" 
podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.42:8082/healthz\": dial tcp 10.132.0.42:8082: connect: connection refused" Apr 23 13:53:44.964066 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.964029 2581 generic.go:358] "Generic (PLEG): container finished" podID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerID="82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600" exitCode=0 Apr 23 13:53:44.964293 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.964121 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" event={"ID":"8468f591-62c9-4d0b-85da-66f8dc5ff968","Type":"ContainerDied","Data":"82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600"} Apr 23 13:53:44.965684 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:44.965661 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:53:45.599302 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.599279 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:53:45.688029 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.687942 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-uds\") pod \"8468f591-62c9-4d0b-85da-66f8dc5ff968\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " Apr 23 13:53:45.688029 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.687978 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8468f591-62c9-4d0b-85da-66f8dc5ff968-tls-certs\") pod \"8468f591-62c9-4d0b-85da-66f8dc5ff968\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " Apr 23 13:53:45.688029 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.688018 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xrn8v\" (UniqueName: \"kubernetes.io/projected/8468f591-62c9-4d0b-85da-66f8dc5ff968-kube-api-access-xrn8v\") pod \"8468f591-62c9-4d0b-85da-66f8dc5ff968\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " Apr 23 13:53:45.688349 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.688053 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-tmp\") pod \"8468f591-62c9-4d0b-85da-66f8dc5ff968\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " Apr 23 13:53:45.688349 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.688113 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-cache\") pod \"8468f591-62c9-4d0b-85da-66f8dc5ff968\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " Apr 23 13:53:45.688349 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.688189 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-kserve-provision-location\") pod \"8468f591-62c9-4d0b-85da-66f8dc5ff968\" (UID: \"8468f591-62c9-4d0b-85da-66f8dc5ff968\") " Apr 23 13:53:45.688521 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.688339 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8468f591-62c9-4d0b-85da-66f8dc5ff968" (UID: "8468f591-62c9-4d0b-85da-66f8dc5ff968"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:45.688521 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.688444 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.688521 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.688451 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8468f591-62c9-4d0b-85da-66f8dc5ff968" (UID: "8468f591-62c9-4d0b-85da-66f8dc5ff968"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:45.688777 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.688742 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8468f591-62c9-4d0b-85da-66f8dc5ff968" (UID: "8468f591-62c9-4d0b-85da-66f8dc5ff968"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:45.689049 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.689030 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8468f591-62c9-4d0b-85da-66f8dc5ff968" (UID: "8468f591-62c9-4d0b-85da-66f8dc5ff968"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:53:45.690351 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.690331 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8468f591-62c9-4d0b-85da-66f8dc5ff968-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8468f591-62c9-4d0b-85da-66f8dc5ff968" (UID: "8468f591-62c9-4d0b-85da-66f8dc5ff968"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:53:45.690440 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.690345 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8468f591-62c9-4d0b-85da-66f8dc5ff968-kube-api-access-xrn8v" (OuterVolumeSpecName: "kube-api-access-xrn8v") pod "8468f591-62c9-4d0b-85da-66f8dc5ff968" (UID: "8468f591-62c9-4d0b-85da-66f8dc5ff968"). InnerVolumeSpecName "kube-api-access-xrn8v". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:53:45.788967 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.788925 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.788967 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.788959 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8468f591-62c9-4d0b-85da-66f8dc5ff968-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.788967 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.788972 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xrn8v\" (UniqueName: \"kubernetes.io/projected/8468f591-62c9-4d0b-85da-66f8dc5ff968-kube-api-access-xrn8v\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.789268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.788982 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.789268 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.788991 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8468f591-62c9-4d0b-85da-66f8dc5ff968-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:53:45.970829 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.970729 2581 generic.go:358] "Generic (PLEG): container finished" podID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerID="e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba" exitCode=0 Apr 23 13:53:45.970829 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.970804 2581 
util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" Apr 23 13:53:45.970829 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.970818 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" event={"ID":"8468f591-62c9-4d0b-85da-66f8dc5ff968","Type":"ContainerDied","Data":"e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba"} Apr 23 13:53:45.971069 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.970853 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s" event={"ID":"8468f591-62c9-4d0b-85da-66f8dc5ff968","Type":"ContainerDied","Data":"60691a494f700d54226a5ee2dd224da90e734ed2f432916aa540f9a69d3e5115"} Apr 23 13:53:45.971069 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.970869 2581 scope.go:117] "RemoveContainer" containerID="e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba" Apr 23 13:53:45.980224 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.980205 2581 scope.go:117] "RemoveContainer" containerID="82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600" Apr 23 13:53:45.988408 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.988387 2581 scope.go:117] "RemoveContainer" containerID="9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee" Apr 23 13:53:45.993100 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.993075 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s"] Apr 23 13:53:45.996728 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.996698 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-7bc7d85655-srg9s"] Apr 23 13:53:45.998456 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:53:45.998435 2581 scope.go:117] "RemoveContainer" containerID="e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba" Apr 23 13:53:45.998737 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:53:45.998717 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba\": container with ID starting with e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba not found: ID does not exist" containerID="e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba" Apr 23 13:53:45.998794 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.998751 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba"} err="failed to get container status \"e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba\": rpc error: code = NotFound desc = could not find container \"e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba\": container with ID starting with e0d707ff8939bc2ac930ef55ca0fabbd270afb036aa320683e6b8511f76e48ba not found: ID does not exist" Apr 23 13:53:45.998794 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.998773 2581 scope.go:117] "RemoveContainer" containerID="82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600" Apr 23 13:53:45.999057 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:53:45.999038 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600\": container with ID starting with 82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600 not found: ID does not exist" containerID="82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600" Apr 23 13:53:45.999113 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.999065 2581 
pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600"} err="failed to get container status \"82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600\": rpc error: code = NotFound desc = could not find container \"82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600\": container with ID starting with 82868d5d654ff20ca27932bb16a975b290e9de064b34b43f8f2d9bd9a6778600 not found: ID does not exist" Apr 23 13:53:45.999113 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.999081 2581 scope.go:117] "RemoveContainer" containerID="9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee" Apr 23 13:53:45.999356 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:53:45.999337 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee\": container with ID starting with 9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee not found: ID does not exist" containerID="9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee" Apr 23 13:53:45.999414 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:45.999361 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee"} err="failed to get container status \"9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee\": rpc error: code = NotFound desc = could not find container \"9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee\": container with ID starting with 9e7db2e8fb869556472a3075704e1f20c2206135ea168d07a957bd5732d9b9ee not found: ID does not exist" Apr 23 13:53:46.189722 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:53:46.189682 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" path="/var/lib/kubelet/pods/8468f591-62c9-4d0b-85da-66f8dc5ff968/volumes" Apr 23 13:54:05.973807 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:05.973776 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:54:15.036693 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.036660 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8"] Apr 23 13:54:15.037087 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.036901 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" podUID="b12daf06-e20e-4c18-b2f5-a567db50fa8f" containerName="manager" containerID="cri-o://956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557" gracePeriod=30 Apr 23 13:54:15.283357 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.283332 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:54:15.336697 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.336622 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12daf06-e20e-4c18-b2f5-a567db50fa8f-cert\") pod \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\" (UID: \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\") " Apr 23 13:54:15.336851 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.336752 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-f57c9\" (UniqueName: \"kubernetes.io/projected/b12daf06-e20e-4c18-b2f5-a567db50fa8f-kube-api-access-f57c9\") pod \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\" (UID: \"b12daf06-e20e-4c18-b2f5-a567db50fa8f\") " Apr 23 13:54:15.338941 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.338899 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b12daf06-e20e-4c18-b2f5-a567db50fa8f-cert" (OuterVolumeSpecName: "cert") pod "b12daf06-e20e-4c18-b2f5-a567db50fa8f" (UID: "b12daf06-e20e-4c18-b2f5-a567db50fa8f"). InnerVolumeSpecName "cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:54:15.339066 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.338950 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b12daf06-e20e-4c18-b2f5-a567db50fa8f-kube-api-access-f57c9" (OuterVolumeSpecName: "kube-api-access-f57c9") pod "b12daf06-e20e-4c18-b2f5-a567db50fa8f" (UID: "b12daf06-e20e-4c18-b2f5-a567db50fa8f"). InnerVolumeSpecName "kube-api-access-f57c9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:54:15.437964 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.437921 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-f57c9\" (UniqueName: \"kubernetes.io/projected/b12daf06-e20e-4c18-b2f5-a567db50fa8f-kube-api-access-f57c9\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:54:15.437964 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:15.437955 2581 reconciler_common.go:299] "Volume detached for volume \"cert\" (UniqueName: \"kubernetes.io/secret/b12daf06-e20e-4c18-b2f5-a567db50fa8f-cert\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:54:16.077573 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.077536 2581 generic.go:358] "Generic (PLEG): container finished" podID="b12daf06-e20e-4c18-b2f5-a567db50fa8f" containerID="956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557" exitCode=0 Apr 23 13:54:16.078060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.077599 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" Apr 23 13:54:16.078060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.077623 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" event={"ID":"b12daf06-e20e-4c18-b2f5-a567db50fa8f","Type":"ContainerDied","Data":"956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557"} Apr 23 13:54:16.078060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.077668 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8" event={"ID":"b12daf06-e20e-4c18-b2f5-a567db50fa8f","Type":"ContainerDied","Data":"d5bde48162dcd5d113a1b6a214451577df7b33c76858e759d1e0e8c9feab3c0d"} Apr 23 13:54:16.078060 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.077690 2581 scope.go:117] "RemoveContainer" containerID="956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557" Apr 23 13:54:16.090200 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.089192 2581 scope.go:117] "RemoveContainer" containerID="956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557" Apr 23 13:54:16.090564 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:54:16.090542 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557\": container with ID starting with 956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557 not found: ID does not exist" containerID="956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557" Apr 23 13:54:16.090628 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.090576 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557"} err="failed to get container status \"956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557\": 
rpc error: code = NotFound desc = could not find container \"956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557\": container with ID starting with 956d75b7245985be4627a3ac2727998f188a97c58232288e6649dfc1c5b0b557 not found: ID does not exist" Apr 23 13:54:16.099375 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.099342 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8"] Apr 23 13:54:16.102915 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.102886 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve/llmisvc-controller-manager-595fbbc8cc-tdjd8"] Apr 23 13:54:16.187927 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:54:16.187893 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b12daf06-e20e-4c18-b2f5-a567db50fa8f" path="/var/lib/kubelet/pods/b12daf06-e20e-4c18-b2f5-a567db50fa8f/volumes" Apr 23 13:55:58.264790 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:55:58.264764 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 13:55:58.270409 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:55:58.270387 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 13:56:10.772212 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:10.772108 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9"] Apr 23 13:56:10.772630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:10.772468 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="main" 
containerID="cri-o://30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4" gracePeriod=30 Apr 23 13:56:10.772630 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:10.772521 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="tokenizer" containerID="cri-o://73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd" gracePeriod=30 Apr 23 13:56:11.503751 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:11.503719 2581 generic.go:358] "Generic (PLEG): container finished" podID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerID="30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4" exitCode=0 Apr 23 13:56:11.503924 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:11.503792 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" event={"ID":"5db3d074-e1b7-4d3d-bdae-49e38aad0635","Type":"ContainerDied","Data":"30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4"} Apr 23 13:56:14.965316 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:14.965269 2581 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="tokenizer" probeResult="failure" output="Get \"http://10.132.0.43:8082/healthz\": dial tcp 10.132.0.43:8082: connect: connection refused" Apr 23 13:56:15.824905 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.824876 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:56:15.928888 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.928801 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kserve-provision-location\") pod \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " Apr 23 13:56:15.928888 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.928874 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-tmp\") pod \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " Apr 23 13:56:15.929103 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.928904 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-cache\") pod \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " Apr 23 13:56:15.929103 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.928958 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-8pp54\" (UniqueName: \"kubernetes.io/projected/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kube-api-access-8pp54\") pod \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " Apr 23 13:56:15.929103 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.928992 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-uds\") pod \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\" (UID: 
\"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " Apr 23 13:56:15.929103 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.929029 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tls-certs\") pod \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\" (UID: \"5db3d074-e1b7-4d3d-bdae-49e38aad0635\") " Apr 23 13:56:15.929346 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.929248 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "5db3d074-e1b7-4d3d-bdae-49e38aad0635" (UID: "5db3d074-e1b7-4d3d-bdae-49e38aad0635"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:15.929346 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.929259 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "5db3d074-e1b7-4d3d-bdae-49e38aad0635" (UID: "5db3d074-e1b7-4d3d-bdae-49e38aad0635"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:15.929457 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.929342 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "5db3d074-e1b7-4d3d-bdae-49e38aad0635" (UID: "5db3d074-e1b7-4d3d-bdae-49e38aad0635"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:15.929628 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.929604 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "5db3d074-e1b7-4d3d-bdae-49e38aad0635" (UID: "5db3d074-e1b7-4d3d-bdae-49e38aad0635"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:15.931465 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.931443 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "5db3d074-e1b7-4d3d-bdae-49e38aad0635" (UID: "5db3d074-e1b7-4d3d-bdae-49e38aad0635"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:56:15.931465 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:15.931449 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kube-api-access-8pp54" (OuterVolumeSpecName: "kube-api-access-8pp54") pod "5db3d074-e1b7-4d3d-bdae-49e38aad0635" (UID: "5db3d074-e1b7-4d3d-bdae-49e38aad0635"). InnerVolumeSpecName "kube-api-access-8pp54". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:56:16.029858 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.029823 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-8pp54\" (UniqueName: \"kubernetes.io/projected/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kube-api-access-8pp54\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:56:16.029858 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.029854 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:56:16.029858 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.029865 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:56:16.030326 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.029873 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:56:16.030326 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.029882 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:56:16.030326 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.029891 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/5db3d074-e1b7-4d3d-bdae-49e38aad0635-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:56:16.523738 ip-10-0-133-33 kubenswrapper[2581]: 
I0423 13:56:16.523658 2581 generic.go:358] "Generic (PLEG): container finished" podID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerID="73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd" exitCode=0 Apr 23 13:56:16.523738 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.523727 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" event={"ID":"5db3d074-e1b7-4d3d-bdae-49e38aad0635","Type":"ContainerDied","Data":"73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd"} Apr 23 13:56:16.523738 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.523733 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" Apr 23 13:56:16.523961 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.523754 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9" event={"ID":"5db3d074-e1b7-4d3d-bdae-49e38aad0635","Type":"ContainerDied","Data":"b09840eb8312c193369f0a8a0e733ef9bf708291e2573c1a9dfb5ba92dcdebed"} Apr 23 13:56:16.523961 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.523769 2581 scope.go:117] "RemoveContainer" containerID="73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd" Apr 23 13:56:16.532266 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.532241 2581 scope.go:117] "RemoveContainer" containerID="30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4" Apr 23 13:56:16.539672 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.539654 2581 scope.go:117] "RemoveContainer" containerID="74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348" Apr 23 13:56:16.544267 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.544209 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9"] Apr 23 13:56:16.547480 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.547443 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-sche29fz9"] Apr 23 13:56:16.548653 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.548634 2581 scope.go:117] "RemoveContainer" containerID="73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd" Apr 23 13:56:16.548979 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:56:16.548956 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd\": container with ID starting with 73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd not found: ID does not exist" containerID="73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd" Apr 23 13:56:16.549035 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.548990 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd"} err="failed to get container status \"73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd\": rpc error: code = NotFound desc = could not find container \"73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd\": container with ID starting with 73fdebe1d1ba8e9d18a2599b123d93c3a8212148270cb7835a9495a7249a8fdd not found: ID does not exist" Apr 23 13:56:16.549035 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.549010 2581 scope.go:117] "RemoveContainer" containerID="30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4" Apr 23 13:56:16.549303 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:56:16.549282 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not 
find container \"30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4\": container with ID starting with 30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4 not found: ID does not exist" containerID="30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4" Apr 23 13:56:16.549365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.549309 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4"} err="failed to get container status \"30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4\": rpc error: code = NotFound desc = could not find container \"30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4\": container with ID starting with 30630fa4ed1473f609cbe49a2dd828f9e4d95c051733b0d4323b026c43ed58c4 not found: ID does not exist" Apr 23 13:56:16.549365 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.549328 2581 scope.go:117] "RemoveContainer" containerID="74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348" Apr 23 13:56:16.549573 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:56:16.549545 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348\": container with ID starting with 74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348 not found: ID does not exist" containerID="74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348" Apr 23 13:56:16.549625 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:16.549570 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348"} err="failed to get container status \"74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348\": rpc error: code = NotFound desc = could not find container 
\"74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348\": container with ID starting with 74e261ac7af105cccbaae01bd1674d857f22e4265bfee46c4ee2516f00de6348 not found: ID does not exist" Apr 23 13:56:18.189397 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:18.189364 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" path="/var/lib/kubelet/pods/5db3d074-e1b7-4d3d-bdae-49e38aad0635/volumes" Apr 23 13:56:28.158212 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158177 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7"] Apr 23 13:56:28.158744 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158690 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="tokenizer" Apr 23 13:56:28.158744 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158706 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="tokenizer" Apr 23 13:56:28.158744 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158720 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="main" Apr 23 13:56:28.158744 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158729 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="main" Apr 23 13:56:28.158744 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158741 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b12daf06-e20e-4c18-b2f5-a567db50fa8f" containerName="manager" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158751 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="b12daf06-e20e-4c18-b2f5-a567db50fa8f" containerName="manager" Apr 23 
13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158771 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="storage-initializer" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158778 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="storage-initializer" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158787 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="tokenizer" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158792 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="tokenizer" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158799 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="main" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158804 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="main" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158809 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="storage-initializer" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158815 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="storage-initializer" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158893 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="main" Apr 23 13:56:28.158926 
ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158905 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="5db3d074-e1b7-4d3d-bdae-49e38aad0635" containerName="tokenizer" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158914 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="main" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158921 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="8468f591-62c9-4d0b-85da-66f8dc5ff968" containerName="tokenizer" Apr 23 13:56:28.158926 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.158928 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="b12daf06-e20e-4c18-b2f5-a567db50fa8f" containerName="manager" Apr 23 13:56:28.162636 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.162616 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.165117 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.165094 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:56:28.165263 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.165208 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:56:28.165315 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.165218 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-mh6m2\"" Apr 23 13:56:28.166120 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.166097 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\"" Apr 23 13:56:28.166120 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:56:28.166119 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 23 13:56:28.172678 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.172656 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7"] Apr 23 13:56:28.231113 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.231074 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.231316 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.231122 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.231316 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.231242 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.231316 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:56:28.231302 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-z6vqv\" (UniqueName: \"kubernetes.io/projected/00cd93f0-d418-4585-a830-4218d8e48c09-kube-api-access-z6vqv\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.231427 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.231362 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.231427 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.231388 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00cd93f0-d418-4585-a830-4218d8e48c09-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332023 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.331980 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-z6vqv\" (UniqueName: \"kubernetes.io/projected/00cd93f0-d418-4585-a830-4218d8e48c09-kube-api-access-z6vqv\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 
13:56:28.332246 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.332059 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332246 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.332080 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00cd93f0-d418-4585-a830-4218d8e48c09-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332246 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.332118 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332413 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.332399 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332474 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:56:28.332428 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332577 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.332553 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332640 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.332611 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332689 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.332678 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.332726 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.332709 2581 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.334777 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.334759 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00cd93f0-d418-4585-a830-4218d8e48c09-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.342095 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.342071 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-z6vqv\" (UniqueName: \"kubernetes.io/projected/00cd93f0-d418-4585-a830-4218d8e48c09-kube-api-access-z6vqv\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.473051 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.472950 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:28.607079 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.607014 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7"] Apr 23 13:56:28.609414 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:56:28.609384 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00cd93f0_d418_4585_a830_4218d8e48c09.slice/crio-de6c6f12dd78b53257355ddd6e84b265a66a34199933b4d0165369f811220b68 WatchSource:0}: Error finding container de6c6f12dd78b53257355ddd6e84b265a66a34199933b4d0165369f811220b68: Status 404 returned error can't find the container with id de6c6f12dd78b53257355ddd6e84b265a66a34199933b4d0165369f811220b68 Apr 23 13:56:28.611232 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:28.611214 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:56:29.571998 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:29.571962 2581 generic.go:358] "Generic (PLEG): container finished" podID="00cd93f0-d418-4585-a830-4218d8e48c09" containerID="6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8" exitCode=0 Apr 23 13:56:29.572411 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:29.572061 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" event={"ID":"00cd93f0-d418-4585-a830-4218d8e48c09","Type":"ContainerDied","Data":"6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8"} Apr 23 13:56:29.572411 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:29.572101 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" 
event={"ID":"00cd93f0-d418-4585-a830-4218d8e48c09","Type":"ContainerStarted","Data":"de6c6f12dd78b53257355ddd6e84b265a66a34199933b4d0165369f811220b68"} Apr 23 13:56:30.577415 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:30.577381 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" event={"ID":"00cd93f0-d418-4585-a830-4218d8e48c09","Type":"ContainerStarted","Data":"8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38"} Apr 23 13:56:30.577415 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:30.577417 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" event={"ID":"00cd93f0-d418-4585-a830-4218d8e48c09","Type":"ContainerStarted","Data":"73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0"} Apr 23 13:56:30.577884 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:30.577522 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:30.602731 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:30.602665 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" podStartSLOduration=2.602645235 podStartE2EDuration="2.602645235s" podCreationTimestamp="2026-04-23 13:56:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:56:30.600971361 +0000 UTC m=+1532.872900713" watchObservedRunningTime="2026-04-23 13:56:30.602645235 +0000 UTC m=+1532.874574569" Apr 23 13:56:38.473562 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:38.473508 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:38.473562 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:38.473578 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:38.476388 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:38.476364 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:38.611214 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:38.611127 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:56:53.869649 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.869612 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx"] Apr 23 13:56:53.873396 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.873375 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:53.876237 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.876212 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-zdpj9\"" Apr 23 13:56:53.876432 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.876419 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 23 13:56:53.883740 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.883714 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx"] Apr 23 13:56:53.949639 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.949597 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:53.949639 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.949641 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l9cdz\" (UniqueName: \"kubernetes.io/projected/beafbb76-fe63-4b40-a769-5c32bffa1675-kube-api-access-l9cdz\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:53.949867 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.949667 2581 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:53.949867 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.949713 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:53.949867 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.949793 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/beafbb76-fe63-4b40-a769-5c32bffa1675-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:53.949867 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:53.949821 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.050852 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.050815 2581 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/beafbb76-fe63-4b40-a769-5c32bffa1675-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.050852 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.050855 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.051095 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.050947 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.051095 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.050980 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l9cdz\" (UniqueName: \"kubernetes.io/projected/beafbb76-fe63-4b40-a769-5c32bffa1675-kube-api-access-l9cdz\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.051095 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.051021 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.051095 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.051047 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.051477 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.051448 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.051578 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.051463 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.051578 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.051513 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.051578 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.051524 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.053654 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.053624 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/beafbb76-fe63-4b40-a769-5c32bffa1675-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.059411 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.059387 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l9cdz\" (UniqueName: \"kubernetes.io/projected/beafbb76-fe63-4b40-a769-5c32bffa1675-kube-api-access-l9cdz\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.184901 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.184812 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:54.317544 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.317503 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx"] Apr 23 13:56:54.320431 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:56:54.320399 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbeafbb76_fe63_4b40_a769_5c32bffa1675.slice/crio-1ff943b7b72c272755a9fff1e3e61c4a45ab91114177d83b88f089c84169e51d WatchSource:0}: Error finding container 1ff943b7b72c272755a9fff1e3e61c4a45ab91114177d83b88f089c84169e51d: Status 404 returned error can't find the container with id 1ff943b7b72c272755a9fff1e3e61c4a45ab91114177d83b88f089c84169e51d Apr 23 13:56:54.666013 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.665976 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" event={"ID":"beafbb76-fe63-4b40-a769-5c32bffa1675","Type":"ContainerStarted","Data":"441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b"} Apr 23 13:56:54.666013 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:54.666017 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" event={"ID":"beafbb76-fe63-4b40-a769-5c32bffa1675","Type":"ContainerStarted","Data":"1ff943b7b72c272755a9fff1e3e61c4a45ab91114177d83b88f089c84169e51d"} Apr 23 13:56:55.670427 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:55.670393 2581 generic.go:358] "Generic (PLEG): container finished" podID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerID="441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b" exitCode=0 Apr 23 13:56:55.670817 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:56:55.670485 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" event={"ID":"beafbb76-fe63-4b40-a769-5c32bffa1675","Type":"ContainerDied","Data":"441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b"} Apr 23 13:56:56.675499 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:56.675465 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" event={"ID":"beafbb76-fe63-4b40-a769-5c32bffa1675","Type":"ContainerStarted","Data":"2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708"} Apr 23 13:56:56.675499 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:56.675501 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" event={"ID":"beafbb76-fe63-4b40-a769-5c32bffa1675","Type":"ContainerStarted","Data":"d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5"} Apr 23 13:56:56.675916 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:56.675530 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:56:56.711165 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:56:56.711093 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" podStartSLOduration=3.71107806 podStartE2EDuration="3.71107806s" podCreationTimestamp="2026-04-23 13:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:56:56.708253399 +0000 UTC m=+1558.980182734" watchObservedRunningTime="2026-04-23 13:56:56.71107806 +0000 UTC m=+1558.983007394" Apr 23 13:56:59.614895 ip-10-0-133-33 kubenswrapper[2581]: 
I0423 13:56:59.614860 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:57:04.188907 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:57:04.188879 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:57:04.189286 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:57:04.189010 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:57:04.189286 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:57:04.189021 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:57:04.190217 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:57:04.190198 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:57:25.711595 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:57:25.711566 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:57:58.319582 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:57:58.319552 2581 scope.go:117] "RemoveContainer" containerID="fd5a284e23ed05c9dbeeba7bde9560c014540fa175e2c7ad8d8b173a0a6f10cf" Apr 23 13:57:58.328104 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:57:58.328084 2581 scope.go:117] "RemoveContainer" containerID="a101b74ad05a0d597162ba369e0f3d3d309ab5a90f113b9a397671c1c3fc75c1" Apr 23 13:57:58.335891 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:57:58.335865 2581 scope.go:117] "RemoveContainer" 
containerID="f39523cc328198270c7774dfa091eb5420f5c48ff398605dfb82e5a6937e03e9" Apr 23 13:59:25.642596 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:25.642490 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7"] Apr 23 13:59:25.643198 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:25.642957 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="main" containerID="cri-o://73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0" gracePeriod=30 Apr 23 13:59:25.643198 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:25.643023 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="tokenizer" containerID="cri-o://8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38" gracePeriod=30 Apr 23 13:59:26.216608 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:26.216573 2581 generic.go:358] "Generic (PLEG): container finished" podID="00cd93f0-d418-4585-a830-4218d8e48c09" containerID="73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0" exitCode=0 Apr 23 13:59:26.216795 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:26.216651 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" event={"ID":"00cd93f0-d418-4585-a830-4218d8e48c09","Type":"ContainerDied","Data":"73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0"} Apr 23 13:59:26.997263 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:26.997239 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:59:27.138167 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138127 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00cd93f0-d418-4585-a830-4218d8e48c09-tls-certs\") pod \"00cd93f0-d418-4585-a830-4218d8e48c09\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " Apr 23 13:59:27.138339 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138219 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-uds\") pod \"00cd93f0-d418-4585-a830-4218d8e48c09\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " Apr 23 13:59:27.138404 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138382 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-z6vqv\" (UniqueName: \"kubernetes.io/projected/00cd93f0-d418-4585-a830-4218d8e48c09-kube-api-access-z6vqv\") pod \"00cd93f0-d418-4585-a830-4218d8e48c09\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " Apr 23 13:59:27.138451 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138425 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "00cd93f0-d418-4585-a830-4218d8e48c09" (UID: "00cd93f0-d418-4585-a830-4218d8e48c09"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:27.138451 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138437 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-cache\") pod \"00cd93f0-d418-4585-a830-4218d8e48c09\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " Apr 23 13:59:27.138575 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138550 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-kserve-provision-location\") pod \"00cd93f0-d418-4585-a830-4218d8e48c09\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " Apr 23 13:59:27.138688 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138580 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "00cd93f0-d418-4585-a830-4218d8e48c09" (UID: "00cd93f0-d418-4585-a830-4218d8e48c09"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:27.138688 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138591 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-tmp\") pod \"00cd93f0-d418-4585-a830-4218d8e48c09\" (UID: \"00cd93f0-d418-4585-a830-4218d8e48c09\") " Apr 23 13:59:27.138853 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138837 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:27.138921 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138855 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:27.138992 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.138966 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "00cd93f0-d418-4585-a830-4218d8e48c09" (UID: "00cd93f0-d418-4585-a830-4218d8e48c09"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:27.139550 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.139524 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "00cd93f0-d418-4585-a830-4218d8e48c09" (UID: "00cd93f0-d418-4585-a830-4218d8e48c09"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:27.140469 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.140439 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00cd93f0-d418-4585-a830-4218d8e48c09-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "00cd93f0-d418-4585-a830-4218d8e48c09" (UID: "00cd93f0-d418-4585-a830-4218d8e48c09"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:59:27.140687 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.140667 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00cd93f0-d418-4585-a830-4218d8e48c09-kube-api-access-z6vqv" (OuterVolumeSpecName: "kube-api-access-z6vqv") pod "00cd93f0-d418-4585-a830-4218d8e48c09" (UID: "00cd93f0-d418-4585-a830-4218d8e48c09"). InnerVolumeSpecName "kube-api-access-z6vqv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:59:27.222532 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.222495 2581 generic.go:358] "Generic (PLEG): container finished" podID="00cd93f0-d418-4585-a830-4218d8e48c09" containerID="8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38" exitCode=0 Apr 23 13:59:27.222740 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.222578 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" Apr 23 13:59:27.222740 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.222579 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" event={"ID":"00cd93f0-d418-4585-a830-4218d8e48c09","Type":"ContainerDied","Data":"8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38"} Apr 23 13:59:27.222740 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.222619 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7" event={"ID":"00cd93f0-d418-4585-a830-4218d8e48c09","Type":"ContainerDied","Data":"de6c6f12dd78b53257355ddd6e84b265a66a34199933b4d0165369f811220b68"} Apr 23 13:59:27.222740 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.222639 2581 scope.go:117] "RemoveContainer" containerID="8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38" Apr 23 13:59:27.232184 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.232167 2581 scope.go:117] "RemoveContainer" containerID="73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0" Apr 23 13:59:27.239897 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.239869 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-z6vqv\" (UniqueName: \"kubernetes.io/projected/00cd93f0-d418-4585-a830-4218d8e48c09-kube-api-access-z6vqv\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:27.240027 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.239901 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:27.240027 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.239917 2581 
reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/00cd93f0-d418-4585-a830-4218d8e48c09-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:27.240027 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.239931 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00cd93f0-d418-4585-a830-4218d8e48c09-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:27.241687 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.241665 2581 scope.go:117] "RemoveContainer" containerID="6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8" Apr 23 13:59:27.245415 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.245388 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7"] Apr 23 13:59:27.249333 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.249311 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-74864n5wx7"] Apr 23 13:59:27.250880 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.250852 2581 scope.go:117] "RemoveContainer" containerID="8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38" Apr 23 13:59:27.251134 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:59:27.251118 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38\": container with ID starting with 8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38 not found: ID does not exist" containerID="8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38" Apr 23 13:59:27.251211 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.251142 2581 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38"} err="failed to get container status \"8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38\": rpc error: code = NotFound desc = could not find container \"8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38\": container with ID starting with 8b400b861e452ebd2ff11645a7e046fe3437202c892c75cac9367950b9773d38 not found: ID does not exist" Apr 23 13:59:27.251211 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.251178 2581 scope.go:117] "RemoveContainer" containerID="73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0" Apr 23 13:59:27.251383 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:59:27.251365 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0\": container with ID starting with 73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0 not found: ID does not exist" containerID="73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0" Apr 23 13:59:27.251453 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.251394 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0"} err="failed to get container status \"73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0\": rpc error: code = NotFound desc = could not find container \"73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0\": container with ID starting with 73c71b373c1a20acd652512e7fde268f1131fa2294c02c2d71b15a1e70f3acd0 not found: ID does not exist" Apr 23 13:59:27.251453 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.251417 2581 scope.go:117] "RemoveContainer" containerID="6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8" Apr 23 13:59:27.251635 ip-10-0-133-33 
kubenswrapper[2581]: E0423 13:59:27.251620 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8\": container with ID starting with 6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8 not found: ID does not exist" containerID="6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8" Apr 23 13:59:27.251675 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:27.251639 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8"} err="failed to get container status \"6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8\": rpc error: code = NotFound desc = could not find container \"6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8\": container with ID starting with 6630e393d49315c4c2f0111e1f24748575a4e1c700606c0eb43e99806b676df8 not found: ID does not exist" Apr 23 13:59:28.188789 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:28.188758 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" path="/var/lib/kubelet/pods/00cd93f0-d418-4585-a830-4218d8e48c09/volumes" Apr 23 13:59:43.288652 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.288613 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2"] Apr 23 13:59:43.289116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.288978 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="tokenizer" Apr 23 13:59:43.289116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.288990 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="tokenizer" Apr 23 
13:59:43.289116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.289007 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="main" Apr 23 13:59:43.289116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.289014 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="main" Apr 23 13:59:43.289116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.289027 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="storage-initializer" Apr 23 13:59:43.289116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.289033 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="storage-initializer" Apr 23 13:59:43.289116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.289082 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="tokenizer" Apr 23 13:59:43.289116 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.289095 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="00cd93f0-d418-4585-a830-4218d8e48c09" containerName="main" Apr 23 13:59:43.294391 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.294366 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.297175 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.297136 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-jgr8c\"" Apr 23 13:59:43.297546 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.297527 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 23 13:59:43.318359 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.318329 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2"] Apr 23 13:59:43.381530 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.381496 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.381704 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.381540 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.381704 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.381561 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.381704 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.381612 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.381704 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.381638 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqhvv\" (UniqueName: \"kubernetes.io/projected/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kube-api-access-kqhvv\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.381857 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.381739 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.482547 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.482511 2581 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.482770 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.482574 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.482770 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.482609 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.482770 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.482636 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.482770 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.482674 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.482770 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.482706 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqhvv\" (UniqueName: \"kubernetes.io/projected/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kube-api-access-kqhvv\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.483072 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.483046 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.483133 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.483106 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.483215 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.483133 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.483254 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.483210 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.485243 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.485224 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.494722 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.494694 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqhvv\" (UniqueName: \"kubernetes.io/projected/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kube-api-access-kqhvv\") pod \"router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.604972 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.604940 2581 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:43.735233 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:43.735199 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2"] Apr 23 13:59:43.737481 ip-10-0-133-33 kubenswrapper[2581]: W0423 13:59:43.737455 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod4b8c8c41_b03f_4107_bc5a_042ccfb9df88.slice/crio-299a2a110b9ce5276f73aca20956825d9e25b403777c91c33d491b466be93709 WatchSource:0}: Error finding container 299a2a110b9ce5276f73aca20956825d9e25b403777c91c33d491b466be93709: Status 404 returned error can't find the container with id 299a2a110b9ce5276f73aca20956825d9e25b403777c91c33d491b466be93709 Apr 23 13:59:44.284676 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:44.284641 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" event={"ID":"4b8c8c41-b03f-4107-bc5a-042ccfb9df88","Type":"ContainerStarted","Data":"0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3"} Apr 23 13:59:44.284676 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:44.284678 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" event={"ID":"4b8c8c41-b03f-4107-bc5a-042ccfb9df88","Type":"ContainerStarted","Data":"299a2a110b9ce5276f73aca20956825d9e25b403777c91c33d491b466be93709"} Apr 23 13:59:45.289479 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:45.289429 2581 generic.go:358] "Generic (PLEG): container finished" podID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerID="0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3" exitCode=0 Apr 23 13:59:45.289479 ip-10-0-133-33 kubenswrapper[2581]: I0423 
13:59:45.289478 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" event={"ID":"4b8c8c41-b03f-4107-bc5a-042ccfb9df88","Type":"ContainerDied","Data":"0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3"} Apr 23 13:59:46.295720 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:46.295675 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" event={"ID":"4b8c8c41-b03f-4107-bc5a-042ccfb9df88","Type":"ContainerStarted","Data":"643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321"} Apr 23 13:59:46.296139 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:46.295728 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" event={"ID":"4b8c8c41-b03f-4107-bc5a-042ccfb9df88","Type":"ContainerStarted","Data":"661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f"} Apr 23 13:59:46.296139 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:46.295888 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:46.317251 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:46.317190 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" podStartSLOduration=3.317167984 podStartE2EDuration="3.317167984s" podCreationTimestamp="2026-04-23 13:59:43 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:59:46.315513131 +0000 UTC m=+1728.587442488" watchObservedRunningTime="2026-04-23 13:59:46.317167984 +0000 UTC m=+1728.589097311" Apr 23 13:59:51.838011 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:59:51.837969 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx"] Apr 23 13:59:51.838484 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:51.838417 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="main" containerID="cri-o://d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5" gracePeriod=30 Apr 23 13:59:51.838554 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:51.838509 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="tokenizer" containerID="cri-o://2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708" gracePeriod=30 Apr 23 13:59:52.322438 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:52.322401 2581 generic.go:358] "Generic (PLEG): container finished" podID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerID="d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5" exitCode=0 Apr 23 13:59:52.322612 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:52.322435 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" event={"ID":"beafbb76-fe63-4b40-a769-5c32bffa1675","Type":"ContainerDied","Data":"d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5"} Apr 23 13:59:53.193112 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.193087 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:59:53.277200 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277085 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l9cdz\" (UniqueName: \"kubernetes.io/projected/beafbb76-fe63-4b40-a769-5c32bffa1675-kube-api-access-l9cdz\") pod \"beafbb76-fe63-4b40-a769-5c32bffa1675\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " Apr 23 13:59:53.277363 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277209 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-uds\") pod \"beafbb76-fe63-4b40-a769-5c32bffa1675\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " Apr 23 13:59:53.277363 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277243 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-tmp\") pod \"beafbb76-fe63-4b40-a769-5c32bffa1675\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " Apr 23 13:59:53.277363 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277261 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-cache\") pod \"beafbb76-fe63-4b40-a769-5c32bffa1675\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " Apr 23 13:59:53.277363 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277283 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-kserve-provision-location\") pod \"beafbb76-fe63-4b40-a769-5c32bffa1675\" (UID: 
\"beafbb76-fe63-4b40-a769-5c32bffa1675\") " Apr 23 13:59:53.277363 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277313 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/beafbb76-fe63-4b40-a769-5c32bffa1675-tls-certs\") pod \"beafbb76-fe63-4b40-a769-5c32bffa1675\" (UID: \"beafbb76-fe63-4b40-a769-5c32bffa1675\") " Apr 23 13:59:53.277624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277550 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "beafbb76-fe63-4b40-a769-5c32bffa1675" (UID: "beafbb76-fe63-4b40-a769-5c32bffa1675"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:53.277624 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277565 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "beafbb76-fe63-4b40-a769-5c32bffa1675" (UID: "beafbb76-fe63-4b40-a769-5c32bffa1675"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:53.277738 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277627 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "beafbb76-fe63-4b40-a769-5c32bffa1675" (UID: "beafbb76-fe63-4b40-a769-5c32bffa1675"). InnerVolumeSpecName "tokenizer-tmp". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:53.277985 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.277962 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "beafbb76-fe63-4b40-a769-5c32bffa1675" (UID: "beafbb76-fe63-4b40-a769-5c32bffa1675"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:53.279476 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.279443 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/beafbb76-fe63-4b40-a769-5c32bffa1675-kube-api-access-l9cdz" (OuterVolumeSpecName: "kube-api-access-l9cdz") pod "beafbb76-fe63-4b40-a769-5c32bffa1675" (UID: "beafbb76-fe63-4b40-a769-5c32bffa1675"). InnerVolumeSpecName "kube-api-access-l9cdz". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:59:53.279574 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.279526 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/beafbb76-fe63-4b40-a769-5c32bffa1675-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "beafbb76-fe63-4b40-a769-5c32bffa1675" (UID: "beafbb76-fe63-4b40-a769-5c32bffa1675"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:59:53.327747 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.327705 2581 generic.go:358] "Generic (PLEG): container finished" podID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerID="2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708" exitCode=0 Apr 23 13:59:53.327909 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.327786 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" Apr 23 13:59:53.327909 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.327786 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" event={"ID":"beafbb76-fe63-4b40-a769-5c32bffa1675","Type":"ContainerDied","Data":"2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708"} Apr 23 13:59:53.327909 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.327828 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx" event={"ID":"beafbb76-fe63-4b40-a769-5c32bffa1675","Type":"ContainerDied","Data":"1ff943b7b72c272755a9fff1e3e61c4a45ab91114177d83b88f089c84169e51d"} Apr 23 13:59:53.327909 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.327844 2581 scope.go:117] "RemoveContainer" containerID="2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708" Apr 23 13:59:53.336986 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.336967 2581 scope.go:117] "RemoveContainer" containerID="d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5" Apr 23 13:59:53.345313 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.344929 2581 scope.go:117] "RemoveContainer" containerID="441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b" Apr 23 13:59:53.349928 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.349906 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx"] Apr 23 13:59:53.355764 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.355733 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schesfsgx"] Apr 23 13:59:53.356431 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.356412 2581 
scope.go:117] "RemoveContainer" containerID="2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708" Apr 23 13:59:53.356743 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:59:53.356726 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708\": container with ID starting with 2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708 not found: ID does not exist" containerID="2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708" Apr 23 13:59:53.356805 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.356752 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708"} err="failed to get container status \"2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708\": rpc error: code = NotFound desc = could not find container \"2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708\": container with ID starting with 2530ed7fb7806ccd357cd06c8482844914649a0265bdf35a6f8b6ad681f13708 not found: ID does not exist" Apr 23 13:59:53.356805 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.356771 2581 scope.go:117] "RemoveContainer" containerID="d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5" Apr 23 13:59:53.357018 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:59:53.356995 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5\": container with ID starting with d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5 not found: ID does not exist" containerID="d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5" Apr 23 13:59:53.357062 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.357026 2581 pod_container_deletor.go:53] 
"DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5"} err="failed to get container status \"d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5\": rpc error: code = NotFound desc = could not find container \"d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5\": container with ID starting with d59f5e2b62082059f3c41ea63d8d4b76ef1f8968c36e7ac5225c30c4cec798d5 not found: ID does not exist" Apr 23 13:59:53.357062 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.357042 2581 scope.go:117] "RemoveContainer" containerID="441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b" Apr 23 13:59:53.357271 ip-10-0-133-33 kubenswrapper[2581]: E0423 13:59:53.357253 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b\": container with ID starting with 441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b not found: ID does not exist" containerID="441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b" Apr 23 13:59:53.357321 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.357278 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b"} err="failed to get container status \"441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b\": rpc error: code = NotFound desc = could not find container \"441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b\": container with ID starting with 441cc60af4c9f840eca977cb6bbc42f38c642d2921669b42f843e7a2708ce98b not found: ID does not exist" Apr 23 13:59:53.378414 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.378379 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:53.378414 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.378418 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:53.378595 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.378436 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:53.378595 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.378451 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/beafbb76-fe63-4b40-a769-5c32bffa1675-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:53.378595 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.378466 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/beafbb76-fe63-4b40-a769-5c32bffa1675-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:53.378595 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.378482 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l9cdz\" (UniqueName: \"kubernetes.io/projected/beafbb76-fe63-4b40-a769-5c32bffa1675-kube-api-access-l9cdz\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\"" Apr 23 13:59:53.605233 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.605195 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:53.605433 ip-10-0-133-33 
kubenswrapper[2581]: I0423 13:59:53.605245 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:53.608139 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:53.608112 2581 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 13:59:54.189277 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:54.189243 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" path="/var/lib/kubelet/pods/beafbb76-fe63-4b40-a769-5c32bffa1675/volumes" Apr 23 13:59:54.333935 ip-10-0-133-33 kubenswrapper[2581]: I0423 13:59:54.333899 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 14:00:16.341802 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:00:16.341774 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 14:00:58.293881 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:00:58.293852 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 14:00:58.299145 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:00:58.299124 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 14:02:14.695342 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:14.695306 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2"] Apr 23 14:02:14.695889 ip-10-0-133-33 kubenswrapper[2581]: I0423 
14:02:14.695793 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="main" containerID="cri-o://661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f" gracePeriod=30 Apr 23 14:02:14.695889 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:14.695826 2581 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="tokenizer" containerID="cri-o://643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321" gracePeriod=30 Apr 23 14:02:14.819762 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:14.819731 2581 generic.go:358] "Generic (PLEG): container finished" podID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerID="661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f" exitCode=0 Apr 23 14:02:14.819925 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:14.819813 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" event={"ID":"4b8c8c41-b03f-4107-bc5a-042ccfb9df88","Type":"ContainerDied","Data":"661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f"} Apr 23 14:02:16.051297 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.051271 2581 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" Apr 23 14:02:16.179379 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179340 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqhvv\" (UniqueName: \"kubernetes.io/projected/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kube-api-access-kqhvv\") pod \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " Apr 23 14:02:16.179593 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179392 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-uds\") pod \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " Apr 23 14:02:16.179593 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179419 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-cache\") pod \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " Apr 23 14:02:16.179593 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179459 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kserve-provision-location\") pod \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") " Apr 23 14:02:16.179593 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179510 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-tmp\") pod \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\" (UID: 
\"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") "
Apr 23 14:02:16.179593 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179552 2581 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tls-certs\") pod \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\" (UID: \"4b8c8c41-b03f-4107-bc5a-042ccfb9df88\") "
Apr 23 14:02:16.179864 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179670 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "4b8c8c41-b03f-4107-bc5a-042ccfb9df88" (UID: "4b8c8c41-b03f-4107-bc5a-042ccfb9df88"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:02:16.179864 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179683 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "4b8c8c41-b03f-4107-bc5a-042ccfb9df88" (UID: "4b8c8c41-b03f-4107-bc5a-042ccfb9df88"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:02:16.179864 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179784 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-uds\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 14:02:16.179864 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179803 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-cache\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 14:02:16.180057 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.179954 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "4b8c8c41-b03f-4107-bc5a-042ccfb9df88" (UID: "4b8c8c41-b03f-4107-bc5a-042ccfb9df88"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:02:16.180414 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.180391 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "4b8c8c41-b03f-4107-bc5a-042ccfb9df88" (UID: "4b8c8c41-b03f-4107-bc5a-042ccfb9df88"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 14:02:16.181826 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.181804 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kube-api-access-kqhvv" (OuterVolumeSpecName: "kube-api-access-kqhvv") pod "4b8c8c41-b03f-4107-bc5a-042ccfb9df88" (UID: "4b8c8c41-b03f-4107-bc5a-042ccfb9df88"). InnerVolumeSpecName "kube-api-access-kqhvv". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 14:02:16.181889 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.181829 2581 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "4b8c8c41-b03f-4107-bc5a-042ccfb9df88" (UID: "4b8c8c41-b03f-4107-bc5a-042ccfb9df88"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 14:02:16.280812 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.280773 2581 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tokenizer-tmp\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 14:02:16.280812 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.280812 2581 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-tls-certs\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 14:02:16.281045 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.280827 2581 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kqhvv\" (UniqueName: \"kubernetes.io/projected/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kube-api-access-kqhvv\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 14:02:16.281045 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.280842 2581 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/4b8c8c41-b03f-4107-bc5a-042ccfb9df88-kserve-provision-location\") on node \"ip-10-0-133-33.ec2.internal\" DevicePath \"\""
Apr 23 14:02:16.829428 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.829394 2581 generic.go:358] "Generic (PLEG): container finished" podID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerID="643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321" exitCode=0
Apr 23 14:02:16.829428 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.829432 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" event={"ID":"4b8c8c41-b03f-4107-bc5a-042ccfb9df88","Type":"ContainerDied","Data":"643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321"}
Apr 23 14:02:16.829644 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.829454 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2" event={"ID":"4b8c8c41-b03f-4107-bc5a-042ccfb9df88","Type":"ContainerDied","Data":"299a2a110b9ce5276f73aca20956825d9e25b403777c91c33d491b466be93709"}
Apr 23 14:02:16.829644 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.829465 2581 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2"
Apr 23 14:02:16.829644 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.829469 2581 scope.go:117] "RemoveContainer" containerID="643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321"
Apr 23 14:02:16.838124 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.838103 2581 scope.go:117] "RemoveContainer" containerID="661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f"
Apr 23 14:02:16.845872 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.845850 2581 scope.go:117] "RemoveContainer" containerID="0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3"
Apr 23 14:02:16.849028 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.849002 2581 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2"]
Apr 23 14:02:16.852667 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.852643 2581 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-7fbdf5498vcgd2"]
Apr 23 14:02:16.855167 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.855127 2581 scope.go:117] "RemoveContainer" containerID="643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321"
Apr 23 14:02:16.855473 ip-10-0-133-33 kubenswrapper[2581]: E0423 14:02:16.855453 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321\": container with ID starting with 643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321 not found: ID does not exist" containerID="643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321"
Apr 23 14:02:16.855543 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.855482 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321"} err="failed to get container status \"643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321\": rpc error: code = NotFound desc = could not find container \"643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321\": container with ID starting with 643f6e5007accb8d2c5725d03f08d266f6aeff4985e33b54a6b5fa8cfbf0a321 not found: ID does not exist"
Apr 23 14:02:16.855543 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.855501 2581 scope.go:117] "RemoveContainer" containerID="661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f"
Apr 23 14:02:16.855745 ip-10-0-133-33 kubenswrapper[2581]: E0423 14:02:16.855724 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f\": container with ID starting with 661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f not found: ID does not exist" containerID="661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f"
Apr 23 14:02:16.855805 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.855754 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f"} err="failed to get container status \"661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f\": rpc error: code = NotFound desc = could not find container \"661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f\": container with ID starting with 661f36269aba63c40ccf22fcbba9d0d842c664c105cd00452f08266ca022f67f not found: ID does not exist"
Apr 23 14:02:16.855805 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.855773 2581 scope.go:117] "RemoveContainer" containerID="0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3"
Apr 23 14:02:16.856048 ip-10-0-133-33 kubenswrapper[2581]: E0423 14:02:16.856020 2581 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3\": container with ID starting with 0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3 not found: ID does not exist" containerID="0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3"
Apr 23 14:02:16.856048 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:16.856041 2581 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3"} err="failed to get container status \"0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3\": rpc error: code = NotFound desc = could not find container \"0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3\": container with ID starting with 0b46b9dac8498bddec69487a15ef0b2c4686d9d451b3bf8976d7c96f22cd42e3 not found: ID does not exist"
Apr 23 14:02:18.188563 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:18.188529 2581 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" path="/var/lib/kubelet/pods/4b8c8c41-b03f-4107-bc5a-042ccfb9df88/volumes"
Apr 23 14:02:46.180601 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:46.180575 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-tkjgm_45563efb-424c-493a-b947-946985c787f6/manager/0.log"
Apr 23 14:02:48.537238 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537204 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7s9qb/must-gather-wlw7j"]
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537597 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="main"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537610 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="main"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537620 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="tokenizer"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537625 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="tokenizer"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537637 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="storage-initializer"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537642 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="storage-initializer"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537648 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="main"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537653 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="main"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537660 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="tokenizer"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537668 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="tokenizer"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537682 2581 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="storage-initializer"
Apr 23 14:02:48.537706 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537688 2581 state_mem.go:107] "Deleted CPUSet assignment" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="storage-initializer"
Apr 23 14:02:48.538063 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537760 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="main"
Apr 23 14:02:48.538063 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537774 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="main"
Apr 23 14:02:48.538063 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537779 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="beafbb76-fe63-4b40-a769-5c32bffa1675" containerName="tokenizer"
Apr 23 14:02:48.538063 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.537787 2581 memory_manager.go:356] "RemoveStaleState removing state" podUID="4b8c8c41-b03f-4107-bc5a-042ccfb9df88" containerName="tokenizer"
Apr 23 14:02:48.540767 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.540745 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s9qb/must-gather-wlw7j"
Apr 23 14:02:48.544229 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.544207 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s9qb\"/\"kube-root-ca.crt\""
Apr 23 14:02:48.544972 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.544952 2581 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s9qb\"/\"openshift-service-ca.crt\""
Apr 23 14:02:48.545257 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.545232 2581 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7s9qb\"/\"default-dockercfg-b5cf6\""
Apr 23 14:02:48.558413 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.558378 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s9qb/must-gather-wlw7j"]
Apr 23 14:02:48.659556 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.659521 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nwxbz\" (UniqueName: \"kubernetes.io/projected/34b5800b-5cf3-4de0-ba61-1fbd49181b88-kube-api-access-nwxbz\") pod \"must-gather-wlw7j\" (UID: \"34b5800b-5cf3-4de0-ba61-1fbd49181b88\") " pod="openshift-must-gather-7s9qb/must-gather-wlw7j"
Apr 23 14:02:48.659556 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.659566 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/34b5800b-5cf3-4de0-ba61-1fbd49181b88-must-gather-output\") pod \"must-gather-wlw7j\" (UID: \"34b5800b-5cf3-4de0-ba61-1fbd49181b88\") " pod="openshift-must-gather-7s9qb/must-gather-wlw7j"
Apr 23 14:02:48.760538 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.760498 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nwxbz\" (UniqueName: \"kubernetes.io/projected/34b5800b-5cf3-4de0-ba61-1fbd49181b88-kube-api-access-nwxbz\") pod \"must-gather-wlw7j\" (UID: \"34b5800b-5cf3-4de0-ba61-1fbd49181b88\") " pod="openshift-must-gather-7s9qb/must-gather-wlw7j"
Apr 23 14:02:48.760538 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.760545 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/34b5800b-5cf3-4de0-ba61-1fbd49181b88-must-gather-output\") pod \"must-gather-wlw7j\" (UID: \"34b5800b-5cf3-4de0-ba61-1fbd49181b88\") " pod="openshift-must-gather-7s9qb/must-gather-wlw7j"
Apr 23 14:02:48.760853 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.760835 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/34b5800b-5cf3-4de0-ba61-1fbd49181b88-must-gather-output\") pod \"must-gather-wlw7j\" (UID: \"34b5800b-5cf3-4de0-ba61-1fbd49181b88\") " pod="openshift-must-gather-7s9qb/must-gather-wlw7j"
Apr 23 14:02:48.770129 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.770106 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nwxbz\" (UniqueName: \"kubernetes.io/projected/34b5800b-5cf3-4de0-ba61-1fbd49181b88-kube-api-access-nwxbz\") pod \"must-gather-wlw7j\" (UID: \"34b5800b-5cf3-4de0-ba61-1fbd49181b88\") " pod="openshift-must-gather-7s9qb/must-gather-wlw7j"
Apr 23 14:02:48.850431 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.850348 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s9qb/must-gather-wlw7j"
Apr 23 14:02:48.981936 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.981911 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s9qb/must-gather-wlw7j"]
Apr 23 14:02:48.984342 ip-10-0-133-33 kubenswrapper[2581]: W0423 14:02:48.984315 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod34b5800b_5cf3_4de0_ba61_1fbd49181b88.slice/crio-736de8c2cc87d705660f8ee268abc469c7d175c262a15cc737b388186cd075d3 WatchSource:0}: Error finding container 736de8c2cc87d705660f8ee268abc469c7d175c262a15cc737b388186cd075d3: Status 404 returned error can't find the container with id 736de8c2cc87d705660f8ee268abc469c7d175c262a15cc737b388186cd075d3
Apr 23 14:02:48.986085 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:48.986069 2581 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 14:02:49.962431 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:49.962383 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/must-gather-wlw7j" event={"ID":"34b5800b-5cf3-4de0-ba61-1fbd49181b88","Type":"ContainerStarted","Data":"2803a41ad22dcae9552c51351ccd444f8f19c34daf0f518cfd1ff0a3b1a4edca"}
Apr 23 14:02:49.962855 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:49.962435 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/must-gather-wlw7j" event={"ID":"34b5800b-5cf3-4de0-ba61-1fbd49181b88","Type":"ContainerStarted","Data":"736de8c2cc87d705660f8ee268abc469c7d175c262a15cc737b388186cd075d3"}
Apr 23 14:02:50.969028 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:50.968978 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/must-gather-wlw7j" event={"ID":"34b5800b-5cf3-4de0-ba61-1fbd49181b88","Type":"ContainerStarted","Data":"3902c84bdd582f99dd9a0ca58f2c8623177f229eac12ed96c5d1c08db9197fd2"}
Apr 23 14:02:50.990456 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:50.990405 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7s9qb/must-gather-wlw7j" podStartSLOduration=2.153877351 podStartE2EDuration="2.99038773s" podCreationTimestamp="2026-04-23 14:02:48 +0000 UTC" firstStartedPulling="2026-04-23 14:02:48.986226142 +0000 UTC m=+1911.258155456" lastFinishedPulling="2026-04-23 14:02:49.822736508 +0000 UTC m=+1912.094665835" observedRunningTime="2026-04-23 14:02:50.990143286 +0000 UTC m=+1913.262072619" watchObservedRunningTime="2026-04-23 14:02:50.99038773 +0000 UTC m=+1913.262317063"
Apr 23 14:02:51.535040 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:51.535007 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-272n2_b3934add-0113-4201-a17a-eaa6e5cbec42/global-pull-secret-syncer/0.log"
Apr 23 14:02:51.688256 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:51.688228 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-qnswk_991e4c70-1167-498d-8923-054a866e840b/konnectivity-agent/0.log"
Apr 23 14:02:51.772331 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:51.772300 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-133-33.ec2.internal_377b4825cda22d155cdffc4e7cfa1c2e/haproxy/0.log"
Apr 23 14:02:55.805513 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:55.805407 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_limitador-operator-controller-manager-c7fb4c8d5-tkjgm_45563efb-424c-493a-b947-946985c787f6/manager/0.log"
Apr 23 14:02:57.025838 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.025804 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6ffj6_97bff001-f602-4b45-914b-959cae86353d/kube-state-metrics/0.log"
Apr 23 14:02:57.053249 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.053223 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6ffj6_97bff001-f602-4b45-914b-959cae86353d/kube-rbac-proxy-main/0.log"
Apr 23 14:02:57.086065 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.085962 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_kube-state-metrics-69db897b98-6ffj6_97bff001-f602-4b45-914b-959cae86353d/kube-rbac-proxy-self/0.log"
Apr 23 14:02:57.145797 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.145763 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-9czs9_791eaf8b-2188-43eb-8223-cc415e1fd93f/monitoring-plugin/0.log"
Apr 23 14:02:57.185455 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.185421 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bndgx_4a3714a4-0aab-49d4-9386-940e1b4abedf/node-exporter/0.log"
Apr 23 14:02:57.221747 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.221582 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bndgx_4a3714a4-0aab-49d4-9386-940e1b4abedf/kube-rbac-proxy/0.log"
Apr 23 14:02:57.250961 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.250937 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-bndgx_4a3714a4-0aab-49d4-9386-940e1b4abedf/init-textfile/0.log"
Apr 23 14:02:57.491696 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.491665 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6nr9w_5a35ac1c-e120-48ab-9e65-e5eb9465a9ab/kube-rbac-proxy-main/0.log"
Apr 23 14:02:57.522894 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.522860 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6nr9w_5a35ac1c-e120-48ab-9e65-e5eb9465a9ab/kube-rbac-proxy-self/0.log"
Apr 23 14:02:57.563845 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.563748 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_openshift-state-metrics-9d44df66c-6nr9w_5a35ac1c-e120-48ab-9e65-e5eb9465a9ab/openshift-state-metrics/0.log"
Apr 23 14:02:57.629143 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.629047 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9040ede-a7ad-4c03-9f92-a21411de4988/prometheus/0.log"
Apr 23 14:02:57.677315 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.677262 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9040ede-a7ad-4c03-9f92-a21411de4988/config-reloader/0.log"
Apr 23 14:02:57.714181 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.714111 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9040ede-a7ad-4c03-9f92-a21411de4988/thanos-sidecar/0.log"
Apr 23 14:02:57.748132 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.748099 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9040ede-a7ad-4c03-9f92-a21411de4988/kube-rbac-proxy-web/0.log"
Apr 23 14:02:57.779485 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.779443 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9040ede-a7ad-4c03-9f92-a21411de4988/kube-rbac-proxy/0.log"
Apr 23 14:02:57.807368 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.807343 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9040ede-a7ad-4c03-9f92-a21411de4988/kube-rbac-proxy-thanos/0.log"
Apr 23 14:02:57.842104 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.842066 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-k8s-0_a9040ede-a7ad-4c03-9f92-a21411de4988/init-config-reloader/0.log"
Apr 23 14:02:57.968922 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:57.968882 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-fd27q_b6353867-b800-4aa4-95f5-471fb682e788/prometheus-operator-admission-webhook/0.log"
Apr 23 14:02:58.151331 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:58.151252 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4c896dd7-mpf47_5253e9cf-063e-45ac-a8c8-f8a909c4003a/thanos-query/0.log"
Apr 23 14:02:58.177849 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:58.177779 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4c896dd7-mpf47_5253e9cf-063e-45ac-a8c8-f8a909c4003a/kube-rbac-proxy-web/0.log"
Apr 23 14:02:58.210653 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:58.210621 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4c896dd7-mpf47_5253e9cf-063e-45ac-a8c8-f8a909c4003a/kube-rbac-proxy/0.log"
Apr 23 14:02:58.239343 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:58.239313 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4c896dd7-mpf47_5253e9cf-063e-45ac-a8c8-f8a909c4003a/prom-label-proxy/0.log"
Apr 23 14:02:58.272304 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:58.272280 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4c896dd7-mpf47_5253e9cf-063e-45ac-a8c8-f8a909c4003a/kube-rbac-proxy-rules/0.log"
Apr 23 14:02:58.303967 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:58.303923 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-7b4c896dd7-mpf47_5253e9cf-063e-45ac-a8c8-f8a909c4003a/kube-rbac-proxy-metrics/0.log"
Apr 23 14:02:59.540899 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:02:59.540873 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-console_networking-console-plugin-cb95c66f6-48vsk_26d41729-e489-4e6c-997e-ca85d3402bba/networking-console-plugin/0.log"
Apr 23 14:03:00.716982 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.716947 2581 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"]
Apr 23 14:03:00.721263 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.721236 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.730512 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.730475 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"]
Apr 23 14:03:00.890640 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.890594 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-podres\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.890847 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.890768 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-sys\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.890847 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.890827 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-proc\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.890944 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.890856 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-lib-modules\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.890944 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.890900 2581 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hp6s9\" (UniqueName: \"kubernetes.io/projected/0660b39d-aa92-42dc-bbf6-9bcea08b079c-kube-api-access-hp6s9\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992460 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992361 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-sys\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992460 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992428 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-proc\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992460 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992452 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-lib-modules\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992771 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992486 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hp6s9\" (UniqueName: \"kubernetes.io/projected/0660b39d-aa92-42dc-bbf6-9bcea08b079c-kube-api-access-hp6s9\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992771 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992559 2581 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-podres\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992771 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992709 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-podres\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992936 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992775 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-sys\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992936 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992820 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-proc\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:00.992936 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:00.992906 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0660b39d-aa92-42dc-bbf6-9bcea08b079c-lib-modules\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:01.002066 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:01.002033 2581 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hp6s9\" (UniqueName: \"kubernetes.io/projected/0660b39d-aa92-42dc-bbf6-9bcea08b079c-kube-api-access-hp6s9\") pod \"perf-node-gather-daemonset-p4nlg\" (UID: \"0660b39d-aa92-42dc-bbf6-9bcea08b079c\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:01.037419 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:01.037387 2581 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"
Apr 23 14:03:01.198488 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:01.198462 2581 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg"]
Apr 23 14:03:01.201170 ip-10-0-133-33 kubenswrapper[2581]: W0423 14:03:01.201125 2581 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod0660b39d_aa92_42dc_bbf6_9bcea08b079c.slice/crio-253cd1e6e4d06ff9cd63bfc60a3cb2fb0a6b6be272edf7f205545d3cd19dd6e2 WatchSource:0}: Error finding container 253cd1e6e4d06ff9cd63bfc60a3cb2fb0a6b6be272edf7f205545d3cd19dd6e2: Status 404 returned error can't find the container with id 253cd1e6e4d06ff9cd63bfc60a3cb2fb0a6b6be272edf7f205545d3cd19dd6e2
Apr 23 14:03:01.923056 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:01.923013 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-785s4_0b4ad106-4e39-4a55-96b5-d5f06ffb38f4/dns/0.log"
Apr 23 14:03:01.958051 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:01.958023 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-785s4_0b4ad106-4e39-4a55-96b5-d5f06ffb38f4/kube-rbac-proxy/0.log"
Apr 23 14:03:02.019445 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:02.019403 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg" event={"ID":"0660b39d-aa92-42dc-bbf6-9bcea08b079c","Type":"ContainerStarted","Data":"239cb68ab07088c6b0884375629e4c1ec317abef914e1bab5b4d9ecf89847b05"}
Apr 23 14:03:02.019445 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:02.019437 2581 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg" event={"ID":"0660b39d-aa92-42dc-bbf6-9bcea08b079c","Type":"ContainerStarted","Data":"253cd1e6e4d06ff9cd63bfc60a3cb2fb0a6b6be272edf7f205545d3cd19dd6e2"}
Apr 23
14:03:02.019662 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:02.019536 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg" Apr 23 14:03:02.049356 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:02.049306 2581 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg" podStartSLOduration=2.049286251 podStartE2EDuration="2.049286251s" podCreationTimestamp="2026-04-23 14:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:03:02.046527632 +0000 UTC m=+1924.318456966" watchObservedRunningTime="2026-04-23 14:03:02.049286251 +0000 UTC m=+1924.321215586" Apr 23 14:03:02.090499 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:02.090474 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-57mv2_72a4caa3-d1c9-4761-adeb-e08cb9c63ab4/dns-node-resolver/0.log" Apr 23 14:03:02.707051 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:02.707022 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-6lrl9_b566b258-1bd9-4623-93bd-ae1931a2bc34/node-ca/0.log" Apr 23 14:03:04.161324 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:04.161282 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-tzxth_0d8b2cf1-4023-4221-a610-b0935e9dd17c/serve-healthcheck-canary/0.log" Apr 23 14:03:04.644928 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:04.644894 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gzhr9_d18e4b4d-4c24-4614-b7e1-e4d9ff536c14/insights-operator/0.log" Apr 23 14:03:04.645269 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:04.645250 2581 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-insights_insights-operator-585dfdc468-gzhr9_d18e4b4d-4c24-4614-b7e1-e4d9ff536c14/insights-operator/1.log" Apr 23 14:03:04.735353 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:04.735320 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c5px5_2606ab5f-c4f7-40b8-8265-6068a4813e3f/kube-rbac-proxy/0.log" Apr 23 14:03:04.764681 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:04.764656 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c5px5_2606ab5f-c4f7-40b8-8265-6068a4813e3f/exporter/0.log" Apr 23 14:03:04.788557 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:04.788525 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-c5px5_2606ab5f-c4f7-40b8-8265-6068a4813e3f/extractor/0.log" Apr 23 14:03:07.706765 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:07.706738 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-868f457486-lrj4h_111f2caf-19d2-44c6-ba31-60a71923b291/manager/0.log" Apr 23 14:03:07.764212 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:07.764182 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-5w2b6_493aaa1c-539a-4784-9d88-fb684ffb4971/openshift-lws-operator/0.log" Apr 23 14:03:08.041311 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:08.041221 2581 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-p4nlg" Apr 23 14:03:08.311336 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:08.311271 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_kserve-controller-manager-6b667fdd66-srkvt_474680a9-b7bc-4845-94e6-6cf0a462a53b/manager/0.log" Apr 23 14:03:08.421434 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:08.421409 2581 log.go:25] 
"Finished parsing log file" path="/var/log/pods/kserve_model-serving-api-86f7b4b499-vl94g_b3562eb0-0a43-4b3f-8b7b-6d5b4346cadb/server/0.log" Apr 23 14:03:13.915986 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:13.915958 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xmbzm_e1215a93-a289-40d1-8bf9-bcdfac128f1a/migrator/0.log" Apr 23 14:03:13.943910 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:13.943871 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator_migrator-74bb7799d9-xmbzm_e1215a93-a289-40d1-8bf9-bcdfac128f1a/graceful-termination/0.log" Apr 23 14:03:14.501802 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:14.501742 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f7btc_e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc/kube-storage-version-migrator-operator/1.log" Apr 23 14:03:14.502820 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:14.502800 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-kube-storage-version-migrator-operator_kube-storage-version-migrator-operator-6769c5d45-f7btc_e0a1eb62-d4d4-431f-95e0-db89b3e5d7cc/kube-storage-version-migrator-operator/0.log" Apr 23 14:03:16.055317 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.055289 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cskjl_74d0665e-9801-46d4-acaf-54aeb0d3ecd2/kube-multus-additional-cni-plugins/0.log" Apr 23 14:03:16.085273 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.085245 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cskjl_74d0665e-9801-46d4-acaf-54aeb0d3ecd2/egress-router-binary-copy/0.log" Apr 23 14:03:16.116115 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.116089 2581 log.go:25] "Finished 
parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cskjl_74d0665e-9801-46d4-acaf-54aeb0d3ecd2/cni-plugins/0.log" Apr 23 14:03:16.144319 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.144290 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cskjl_74d0665e-9801-46d4-acaf-54aeb0d3ecd2/bond-cni-plugin/0.log" Apr 23 14:03:16.174708 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.174678 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cskjl_74d0665e-9801-46d4-acaf-54aeb0d3ecd2/routeoverride-cni/0.log" Apr 23 14:03:16.207382 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.207358 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cskjl_74d0665e-9801-46d4-acaf-54aeb0d3ecd2/whereabouts-cni-bincopy/0.log" Apr 23 14:03:16.241584 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.241555 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-cskjl_74d0665e-9801-46d4-acaf-54aeb0d3ecd2/whereabouts-cni/0.log" Apr 23 14:03:16.667598 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.667570 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-vlh7k_bc88e01c-0268-427e-bd19-2df1ccdb32a0/kube-multus/0.log" Apr 23 14:03:16.707991 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.707952 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fpksp_c3fdc691-5051-4c4c-8360-ed987a28f315/network-metrics-daemon/0.log" Apr 23 14:03:16.741611 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:16.741584 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-fpksp_c3fdc691-5051-4c4c-8360-ed987a28f315/kube-rbac-proxy/0.log" Apr 23 14:03:18.161757 ip-10-0-133-33 kubenswrapper[2581]: I0423 
14:03:18.161724 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-controller/0.log" Apr 23 14:03:18.182250 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:18.182221 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/0.log" Apr 23 14:03:18.193840 ip-10-0-133-33 kubenswrapper[2581]: I0423 14:03:18.193812 2581 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-kgrc5_a139498f-5c4f-4db0-a95e-1d466b43fc87/ovn-acl-logging/1.log"