Apr 22 14:12:23.644863 ip-10-0-132-130 systemd[1]: kubelet.service: Failed to load environment files: No such file or directory
Apr 22 14:12:23.644872 ip-10-0-132-130 systemd[1]: kubelet.service: Failed to run 'start-pre' task: No such file or directory
Apr 22 14:12:23.644879 ip-10-0-132-130 systemd[1]: kubelet.service: Failed with result 'resources'.
Apr 22 14:12:23.645101 ip-10-0-132-130 systemd[1]: Failed to start Kubernetes Kubelet.
Apr 22 14:12:33.862361 ip-10-0-132-130 systemd[1]: kubelet.service: Failed to schedule restart job: Unit crio.service not found.
Apr 22 14:12:33.862375 ip-10-0-132-130 systemd[1]: kubelet.service: Failed with result 'resources'.
-- Boot f4ae33ce175f4175af47a00ec76d6a3a --
Apr 22 14:14:59.189651 ip-10-0-132-130 systemd[1]: Starting Kubernetes Kubelet...
Apr 22 14:14:59.678869 ip-10-0-132-130 kubenswrapper[2578]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:14:59.678869 ip-10-0-132-130 kubenswrapper[2578]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 22 14:14:59.678869 ip-10-0-132-130 kubenswrapper[2578]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:14:59.678869 ip-10-0-132-130 kubenswrapper[2578]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 22 14:14:59.678869 ip-10-0-132-130 kubenswrapper[2578]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 22 14:14:59.682404 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.682314 2578 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 22 14:14:59.690156 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690125 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:14:59.690156 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690148 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:14:59.690156 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690151 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:14:59.690156 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690154 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:14:59.690156 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690157 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:14:59.690156 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690160 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:14:59.690156 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690163 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:14:59.690156 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690166 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690171 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690176 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690179 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690182 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690185 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690188 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690191 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690194 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690196 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690199 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690202 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690205 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690207 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690210 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690212 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690215 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690217 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690220 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:14:59.690455 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690222 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690225 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690230 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690233 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690236 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690238 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690241 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690243 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690246 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690249 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690252 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690255 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690257 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690260 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690262 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690266 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690269 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690272 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690274 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690277 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:14:59.690908 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690280 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690282 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690285 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690288 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690291 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690293 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690309 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690312 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690315 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690317 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690320 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690323 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690325 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690328 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690330 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690333 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690336 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690338 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690341 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690345 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:14:59.691447 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690347 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690349 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690352 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690355 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690357 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690360 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690362 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690367 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690370 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690372 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690375 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690377 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690381 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690384 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690386 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690389 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690391 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690394 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690398 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:14:59.691937 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690402 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690804 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690809 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690812 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690815 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690818 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690821 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690824 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690827 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690830 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690833 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690835 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690838 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690841 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690843 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690846 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690849 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690851 2578 feature_gate.go:328] unrecognized feature gate: Example
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690854 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 22 14:14:59.692481 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690856 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690859 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690862 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690865 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690867 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690870 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690873 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690877 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690879 2578 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690882 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690884 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690887 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690889 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690892 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690895 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690897 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690900 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690902 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690905 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690907 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 22 14:14:59.692935 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690910 2578 feature_gate.go:328] unrecognized feature gate: Example2
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690912 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690916 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690918 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690921 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690924 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690927 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690929 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690932 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690934 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690937 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690939 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690942 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690945 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690948 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690950 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690952 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690955 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690957 2578 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690961 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 22 14:14:59.693448 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690963 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690965 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690968 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690970 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690973 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690975 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690978 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690980 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690982 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690985 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690988 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690990 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690993 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.690997 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691001 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691004 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691006 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691009 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691011 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 22 14:14:59.693939 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691014 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691016 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691021 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691024 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691027 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691030 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691033 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691036 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.691041 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693704 2578 flags.go:64] FLAG: --address="0.0.0.0"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693716 2578 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693723 2578 flags.go:64] FLAG: --anonymous-auth="true"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693728 2578 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693734 2578 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693738 2578 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693743 2578 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693747 2578 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693751 2578 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693754 2578 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693758 2578 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693761 2578 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 22 14:14:59.694429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693764 2578 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693768 2578 flags.go:64] FLAG: --cgroup-root=""
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693771 2578 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693774 2578 flags.go:64] FLAG: --client-ca-file=""
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693777 2578 flags.go:64] FLAG: --cloud-config=""
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693780 2578 flags.go:64] FLAG: --cloud-provider="external"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693783 2578 flags.go:64] FLAG: --cluster-dns="[]"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693788 2578 flags.go:64] FLAG: --cluster-domain=""
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693791 2578 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693794 2578 flags.go:64] FLAG: --config-dir=""
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693797 2578 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693801 2578 flags.go:64] FLAG: --container-log-max-files="5"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693805 2578 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693808 2578 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693812 2578 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693815 2578 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693819 2578 flags.go:64] FLAG: --contention-profiling="false"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693822 2578 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693825 2578 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693829 2578 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693832 2578 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693837 2578 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693840 2578 flags.go:64] FLAG: --enable-controller-attach-detach="true"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693843 2578 flags.go:64] FLAG: --enable-debugging-handlers="true"
Apr 22 14:14:59.694941 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693846 2578 flags.go:64] FLAG: --enable-load-reader="false"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693850 2578 flags.go:64] FLAG: --enable-server="true"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693853 2578 flags.go:64] FLAG: --enforce-node-allocatable="[pods]"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693858 2578 flags.go:64] FLAG: --event-burst="100"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693861 2578 flags.go:64] FLAG: --event-qps="50"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693864 2578 flags.go:64] FLAG: --event-storage-age-limit="default=0"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693867 2578 flags.go:64] FLAG: --event-storage-event-limit="default=0"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693870 2578 flags.go:64] FLAG: --eviction-hard=""
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693874 2578 flags.go:64] FLAG: --eviction-max-pod-grace-period="0"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693877 2578 flags.go:64] FLAG: --eviction-minimum-reclaim=""
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693880 2578 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693883 2578 flags.go:64] FLAG: --eviction-soft=""
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693886 2578 flags.go:64] FLAG: --eviction-soft-grace-period=""
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693889 2578 flags.go:64] FLAG: --exit-on-lock-contention="false"
Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693892 2578 flags.go:64]
FLAG: --experimental-allocatable-ignore-eviction="false" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693895 2578 flags.go:64] FLAG: --experimental-mounter-path="" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693899 2578 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693902 2578 flags.go:64] FLAG: --fail-swap-on="true" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693905 2578 flags.go:64] FLAG: --feature-gates="" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693909 2578 flags.go:64] FLAG: --file-check-frequency="20s" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693912 2578 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693915 2578 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693919 2578 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693922 2578 flags.go:64] FLAG: --healthz-port="10248" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693926 2578 flags.go:64] FLAG: --help="false" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693929 2578 flags.go:64] FLAG: --hostname-override="ip-10-0-132-130.ec2.internal" Apr 22 14:14:59.695561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693932 2578 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693936 2578 flags.go:64] FLAG: --http-check-frequency="20s" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693939 2578 flags.go:64] FLAG: 
--image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693942 2578 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693946 2578 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693950 2578 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693953 2578 flags.go:64] FLAG: --image-service-endpoint="" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693956 2578 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693959 2578 flags.go:64] FLAG: --kube-api-burst="100" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693963 2578 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693966 2578 flags.go:64] FLAG: --kube-api-qps="50" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693969 2578 flags.go:64] FLAG: --kube-reserved="" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693972 2578 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693975 2578 flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693978 2578 flags.go:64] FLAG: --kubelet-cgroups="" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693981 2578 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 22 14:14:59.696196 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:14:59.693983 2578 flags.go:64] FLAG: --lock-file="" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693986 2578 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693989 2578 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693992 2578 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.693998 2578 flags.go:64] FLAG: --log-json-split-stream="false" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694001 2578 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694004 2578 flags.go:64] FLAG: --log-text-split-stream="false" Apr 22 14:14:59.696196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694007 2578 flags.go:64] FLAG: --logging-format="text" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694010 2578 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694014 2578 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694017 2578 flags.go:64] FLAG: --manifest-url="" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694020 2578 flags.go:64] FLAG: --manifest-url-header="" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694025 2578 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694028 2578 flags.go:64] FLAG: --max-open-files="1000000" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694032 2578 flags.go:64] FLAG: --max-pods="110" Apr 22 
14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694035 2578 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694038 2578 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694041 2578 flags.go:64] FLAG: --memory-manager-policy="None" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694045 2578 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694048 2578 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694051 2578 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694053 2578 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694062 2578 flags.go:64] FLAG: --node-status-max-images="50" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694065 2578 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694068 2578 flags.go:64] FLAG: --oom-score-adj="-999" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694072 2578 flags.go:64] FLAG: --pod-cidr="" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694075 2578 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694080 2578 flags.go:64] FLAG: --pod-manifest-path="" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 
14:14:59.694084 2578 flags.go:64] FLAG: --pod-max-pids="-1" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694087 2578 flags.go:64] FLAG: --pods-per-core="0" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694091 2578 flags.go:64] FLAG: --port="10250" Apr 22 14:14:59.696777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694094 2578 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694097 2578 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-007ca3887721d0da8" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694101 2578 flags.go:64] FLAG: --qos-reserved="" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694104 2578 flags.go:64] FLAG: --read-only-port="10255" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694107 2578 flags.go:64] FLAG: --register-node="true" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694110 2578 flags.go:64] FLAG: --register-schedulable="true" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694113 2578 flags.go:64] FLAG: --register-with-taints="" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694116 2578 flags.go:64] FLAG: --registry-burst="10" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694119 2578 flags.go:64] FLAG: --registry-qps="5" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694122 2578 flags.go:64] FLAG: --reserved-cpus="" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694125 2578 flags.go:64] FLAG: --reserved-memory="" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694129 2578 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694132 2578 
flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694135 2578 flags.go:64] FLAG: --rotate-certificates="false" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694138 2578 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694140 2578 flags.go:64] FLAG: --runonce="false" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694143 2578 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694146 2578 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694149 2578 flags.go:64] FLAG: --seccomp-default="false" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694152 2578 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694155 2578 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694158 2578 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694161 2578 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694164 2578 flags.go:64] FLAG: --storage-driver-password="root" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694168 2578 flags.go:64] FLAG: --storage-driver-secure="false" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694171 2578 flags.go:64] FLAG: --storage-driver-table="stats" Apr 22 14:14:59.697382 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694173 2578 flags.go:64] FLAG: --storage-driver-user="root" Apr 22 
14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694177 2578 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694180 2578 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694183 2578 flags.go:64] FLAG: --system-cgroups="" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694186 2578 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694191 2578 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694194 2578 flags.go:64] FLAG: --tls-cert-file="" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694197 2578 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694202 2578 flags.go:64] FLAG: --tls-min-version="" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694205 2578 flags.go:64] FLAG: --tls-private-key-file="" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694208 2578 flags.go:64] FLAG: --topology-manager-policy="none" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694211 2578 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694214 2578 flags.go:64] FLAG: --topology-manager-scope="container" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694216 2578 flags.go:64] FLAG: --v="2" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694221 2578 flags.go:64] FLAG: --version="false" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694225 2578 flags.go:64] FLAG: --vmodule="" 
Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694229 2578 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.694233 2578 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694346 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694351 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694354 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694356 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694359 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694362 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:14:59.697995 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694366 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694369 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694372 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694374 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694377 2578 feature_gate.go:328] unrecognized 
feature gate: KMSEncryptionProvider Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694380 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694382 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694385 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694388 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694390 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694394 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694396 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694399 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694401 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694404 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694406 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694409 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694411 2578 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694415 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694417 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:14:59.698594 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694420 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694422 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694425 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694428 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694431 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694434 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694436 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694439 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694443 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694445 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:14:59.699104 ip-10-0-132-130 
kubenswrapper[2578]: W0422 14:14:59.694448 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694451 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694456 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694460 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694465 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694469 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694472 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694475 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694478 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:14:59.699104 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694481 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694484 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694486 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:14:59.699630 
ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694489 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694492 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694495 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694498 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694501 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694503 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694506 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694508 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694511 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694513 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694516 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694519 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694521 2578 feature_gate.go:328] unrecognized feature gate: 
CPMSMachineNamePrefix Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694524 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694527 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694529 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694531 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:14:59.699630 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694534 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694537 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694539 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694542 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694544 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694548 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694551 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694553 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 
22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694556 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694558 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694561 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694563 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694566 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694569 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694571 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694574 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694577 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694583 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694585 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:14:59.700122 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694588 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:14:59.700697 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.694591 2578 feature_gate.go:328] 
unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:14:59.700697 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.695347 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:14:59.703643 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.703621 2578 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 22 14:14:59.703694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.703643 2578 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 22 14:14:59.703694 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703692 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703697 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703701 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703704 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703708 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703711 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703714 2578 
feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703717 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703720 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703723 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703726 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703729 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703732 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703735 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703737 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703740 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703742 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703745 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703748 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:14:59.703748 
ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703752 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:14:59.703748 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703754 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703757 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703760 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703763 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703766 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703768 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703771 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703773 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703776 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703778 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703781 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703785 2578 
feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703787 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703790 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703793 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703795 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703798 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703801 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703806 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:14:59.704237 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703809 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703811 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703814 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703816 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703819 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703821 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703824 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703826 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703829 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703831 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703834 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703837 2578 
feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703841 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703844 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703847 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703851 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703854 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703857 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703860 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:14:59.704756 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703862 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703865 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703867 2578 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703870 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703872 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 
22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703875 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703878 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703881 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703883 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703886 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703888 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703891 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703893 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703896 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703899 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703901 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703904 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703906 2578 
feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703909 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703911 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:14:59.705221 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703913 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703916 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703919 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703921 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703924 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703927 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703929 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.703931 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.703937 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true 
RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704035 2578 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704040 2578 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704043 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704046 2578 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704049 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704053 2578 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. 
Apr 22 14:14:59.705721 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704057 2578 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704060 2578 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704062 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704065 2578 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704068 2578 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704071 2578 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704075 2578 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704078 2578 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704081 2578 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704084 2578 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704086 2578 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704089 2578 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704092 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704094 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704097 2578 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704100 2578 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704102 2578 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704105 2578 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704108 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 22 14:14:59.706102 ip-10-0-132-130 kubenswrapper[2578]: W0422 
14:14:59.704111 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704114 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704116 2578 feature_gate.go:328] unrecognized feature gate: Example2 Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704119 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704122 2578 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704125 2578 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704127 2578 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704130 2578 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704132 2578 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704135 2578 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704137 2578 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704140 2578 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704143 2578 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 22 
14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704145 2578 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704148 2578 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704150 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704153 2578 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704155 2578 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704158 2578 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704160 2578 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 22 14:14:59.706584 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704163 2578 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704165 2578 feature_gate.go:328] unrecognized feature gate: Example Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704169 2578 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704171 2578 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704173 2578 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704176 2578 feature_gate.go:328] 
unrecognized feature gate: MultiArchInstallAzure Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704178 2578 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704181 2578 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704183 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704186 2578 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704188 2578 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704191 2578 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704193 2578 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704196 2578 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704198 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704201 2578 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704204 2578 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704206 2578 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704209 
2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704211 2578 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 22 14:14:59.707070 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704214 2578 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704217 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704219 2578 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704222 2578 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704224 2578 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704226 2578 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704229 2578 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704231 2578 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704234 2578 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704237 2578 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704239 2578 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 22 14:14:59.707581 
ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704242 2578 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704245 2578 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704248 2578 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704250 2578 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704253 2578 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704255 2578 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704258 2578 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704261 2578 feature_gate.go:328] unrecognized feature gate: ShortCertRotation Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704263 2578 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 22 14:14:59.707581 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:14:59.704265 2578 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 22 14:14:59.708093 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.704270 2578 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true 
TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 22 14:14:59.708093 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.705092 2578 server.go:962] "Client rotation is on, will bootstrap in background" Apr 22 14:14:59.709522 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.709507 2578 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir" Apr 22 14:14:59.710597 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.710575 2578 server.go:1019] "Starting client certificate rotation" Apr 22 14:14:59.710704 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.710685 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:14:59.710739 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.710730 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet" Apr 22 14:14:59.741170 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.741147 2578 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:14:59.748268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.748241 2578 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt" Apr 22 14:14:59.765708 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.765682 2578 log.go:25] "Validated CRI v1 runtime API" Apr 22 14:14:59.772524 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.772504 2578 log.go:25] "Validated CRI v1 image API" Apr 22 14:14:59.773762 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.773747 2578 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd" Apr 22 14:14:59.776425 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.776404 2578 reflector.go:430] 
"Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:14:59.780327 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.780284 2578 fs.go:135] Filesystem UUIDs: map[7B77-95E7:/dev/nvme0n1p2 93205ef8-e82d-4670-bf08-fc70de02cb80:/dev/nvme0n1p4 c2ca727b-843e-4f37-ae7e-38947131df44:/dev/nvme0n1p3]
Apr 22 14:14:59.780397 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.780325 2578 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 22 14:14:59.786433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.786315 2578 manager.go:217] Machine: {Timestamp:2026-04-22 14:14:59.784049769 +0000 UTC m=+0.458853025 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3051088 MemoryCapacity:33164488704 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec239c99002d822bb9746f7d3d33941f SystemUUID:ec239c99-002d-822b-b974-6f7d3d33941f BootID:f4ae33ce-175f-4175-af47-a00ec76d6a3a Filesystems:[{Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true} {Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582242304 Type:vfs Inodes:4048399 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:1b:07:a8:ae:65 Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:1b:07:a8:ae:65 Speed:0 Mtu:9001} {Name:ovs-system MacAddress:c6:f9:6c:40:c0:58 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164488704 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 22 14:14:59.786433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.786428 2578 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 22 14:14:59.786551 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.786516 2578 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 22 14:14:59.787729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.787706 2578 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 22 14:14:59.787889 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.787731 2578 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-132-130.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 22 14:14:59.787939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.787900 2578 topology_manager.go:138] "Creating topology manager with none policy"
Apr 22 14:14:59.787939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.787910 2578 container_manager_linux.go:306] "Creating device plugin manager"
Apr 22 14:14:59.787939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.787927 2578 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 14:14:59.788015 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.787940 2578 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 22 14:14:59.789341 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.789330 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 14:14:59.789624 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.789614 2578 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 22 14:14:59.791762 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.791746 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8xltg"
Apr 22 14:14:59.792060 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.792050 2578 kubelet.go:491] "Attempting to sync node with API server"
Apr 22 14:14:59.792097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.792064 2578 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 22 14:14:59.792097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.792076 2578 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 22 14:14:59.792097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.792085 2578 kubelet.go:397] "Adding apiserver pod source"
Apr 22 14:14:59.792097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.792093 2578 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 22 14:14:59.794493 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.794481 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 14:14:59.794540 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.794502 2578 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 22 14:14:59.798724 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.798707 2578 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 22 14:14:59.799093 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.799078 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-8xltg"
Apr 22 14:14:59.800439 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.800423 2578 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled"
Apr 22 14:14:59.802442 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.802418 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 22 14:14:59.802518 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802495 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume"
Apr 22 14:14:59.802518 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802511 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802518 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.802518 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-132-130.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802528 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802538 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802546 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802555 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802561 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802570 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802576 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802585 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected"
Apr 22 14:14:59.802593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.802594 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume"
Apr 22 14:14:59.803450 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.803440 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi"
Apr 22 14:14:59.803450 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.803450 2578 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image"
Apr 22 14:14:59.807733 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.807715 2578 watchdog_linux.go:99] "Systemd watchdog is not enabled"
Apr 22 14:14:59.807823 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.807764 2578 server.go:1295] "Started kubelet"
Apr 22 14:14:59.807897 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.807865 2578 server.go:180] "Starting to listen" address="0.0.0.0" port=10250
Apr 22 14:14:59.807931 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.807854 2578 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10
Apr 22 14:14:59.807964 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.807940 2578 server_v1.go:47] "podresources" method="list" useActivePods=true
Apr 22 14:14:59.808556 ip-10-0-132-130 systemd[1]: Started Kubernetes Kubelet.
Apr 22 14:14:59.810100 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.810079 2578 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 22 14:14:59.810949 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.810931 2578 server.go:317] "Adding debug handlers to kubelet server"
Apr 22 14:14:59.818272 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.818251 2578 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 22 14:14:59.819885 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.819868 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-130.ec2.internal" not found
Apr 22 14:14:59.819949 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.819939 2578 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 22 14:14:59.820533 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.820519 2578 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 22 14:14:59.821041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821017 2578 factory.go:55] Registering systemd factory
Apr 22 14:14:59.821114 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821045 2578 factory.go:223] Registration of the systemd container factory successfully
Apr 22 14:14:59.821286 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821258 2578 factory.go:153] Registering CRI-O factory
Apr 22 14:14:59.821286 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821273 2578 factory.go:223] Registration of the crio container factory successfully
Apr 22 14:14:59.821462 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.821281 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:14:59.821462 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821328 2578 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 22 14:14:59.821462 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821340 2578 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 22 14:14:59.821462 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821343 2578 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 22 14:14:59.821462 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821367 2578 factory.go:103] Registering Raw factory
Apr 22 14:14:59.821462 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821383 2578 manager.go:1196] Started watching for new ooms in manager
Apr 22 14:14:59.821462 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821329 2578 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 22 14:14:59.821900 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821479 2578 reconstruct.go:97] "Volume reconstruction finished"
Apr 22 14:14:59.821900 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821504 2578 reconciler.go:26] "Reconciler: start to sync state"
Apr 22 14:14:59.821900 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.821804 2578 manager.go:319] Starting recovery of all containers
Apr 22 14:14:59.823145 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.823126 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:14:59.825547 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.825522 2578 nodelease.go:49] "Failed to get node when trying to set owner ref to the node lease" err="nodes \"ip-10-0-132-130.ec2.internal\" not found" node="ip-10-0-132-130.ec2.internal"
Apr 22 14:14:59.833451 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.833207 2578 manager.go:324] Recovery completed
Apr 22 14:14:59.836468 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.836447 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-130.ec2.internal" not found
Apr 22 14:14:59.837676 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.837663 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:14:59.840441 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.840424 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:14:59.840504 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.840454 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:14:59.840504 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.840464 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:14:59.841003 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.840988 2578 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 22 14:14:59.841003 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.841000 2578 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 22 14:14:59.841120 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.841018 2578 state_mem.go:36] "Initialized new in-memory state store"
Apr 22 14:14:59.843376 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.843364 2578 policy_none.go:49] "None policy: Start"
Apr 22 14:14:59.843430 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.843380 2578 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 22 14:14:59.843430 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.843390 2578 state_mem.go:35] "Initializing new in-memory state store"
Apr 22 14:14:59.875930 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.875912 2578 manager.go:341] "Starting Device Plugin manager"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.875952 2578 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.875965 2578 server.go:85] "Starting device plugin registration server"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.876196 2578 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.876207 2578 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.876314 2578 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.876394 2578 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.876402 2578 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.876941 2578 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.876981 2578 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:14:59.897480 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.895332 2578 nodeinfomanager.go:417] Failed to publish CSINode: nodes "ip-10-0-132-130.ec2.internal" not found
Apr 22 14:14:59.945234 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.945150 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 22 14:14:59.946558 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.946539 2578 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 22 14:14:59.946651 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.946566 2578 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 22 14:14:59.946651 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.946585 2578 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 22 14:14:59.946651 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.946592 2578 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 22 14:14:59.946651 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.946625 2578 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 22 14:14:59.950309 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.950277 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:14:59.976349 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.976325 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:14:59.977475 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.977462 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:14:59.977543 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.977490 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:14:59.977543 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.977502 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:14:59.977543 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.977533 2578 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-132-130.ec2.internal"
Apr 22 14:14:59.988185 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:14:59.988167 2578 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-132-130.ec2.internal"
Apr 22 14:14:59.988232 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:14:59.988189 2578 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-132-130.ec2.internal\": node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.005242 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.005217 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.047395 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.047342 2578 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal","kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal"]
Apr 22 14:15:00.047479 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.047442 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:00.048432 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.048417 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:00.048502 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.048445 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:00.048502 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.048455 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:00.049725 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.049712 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:00.049922 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.049907 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.049956 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.049949 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:00.050560 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.050528 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:00.050560 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.050551 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:00.050560 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.050561 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:00.050729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.050624 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:00.050729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.050645 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:00.050729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.050657 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:00.051651 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.051637 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.051706 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.051663 2578 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 22 14:15:00.054886 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.054871 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientMemory"
Apr 22 14:15:00.054967 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.054896 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasNoDiskPressure"
Apr 22 14:15:00.054967 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.054909 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeHasSufficientPID"
Apr 22 14:15:00.066864 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.066839 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-130.ec2.internal\" not found" node="ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.071946 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.071922 2578 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-132-130.ec2.internal\" not found" node="ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.106123 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.106095 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.124161 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.124136 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/20438f0448f48ac5e6fdc84ad54ce303-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal\" (UID: \"20438f0448f48ac5e6fdc84ad54ce303\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.124252 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.124164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20438f0448f48ac5e6fdc84ad54ce303-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal\" (UID: \"20438f0448f48ac5e6fdc84ad54ce303\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.124252 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.124185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/973f3092b03452d8285b08e0d93dce0b-config\") pod \"kube-apiserver-proxy-ip-10-0-132-130.ec2.internal\" (UID: \"973f3092b03452d8285b08e0d93dce0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.206751 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.206671 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.225129 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.225100 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/20438f0448f48ac5e6fdc84ad54ce303-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal\" (UID: \"20438f0448f48ac5e6fdc84ad54ce303\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.225242 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.225139 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20438f0448f48ac5e6fdc84ad54ce303-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal\" (UID: \"20438f0448f48ac5e6fdc84ad54ce303\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.225242 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.225165 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/973f3092b03452d8285b08e0d93dce0b-config\") pod \"kube-apiserver-proxy-ip-10-0-132-130.ec2.internal\" (UID: \"973f3092b03452d8285b08e0d93dce0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.225242 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.225193 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/20438f0448f48ac5e6fdc84ad54ce303-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal\" (UID: \"20438f0448f48ac5e6fdc84ad54ce303\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.225242 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.225222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/973f3092b03452d8285b08e0d93dce0b-config\") pod \"kube-apiserver-proxy-ip-10-0-132-130.ec2.internal\" (UID: \"973f3092b03452d8285b08e0d93dce0b\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.225405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.225205 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20438f0448f48ac5e6fdc84ad54ce303-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal\" (UID: \"20438f0448f48ac5e6fdc84ad54ce303\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.307455 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.307422 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.369932 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.369896 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.374525 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.374503 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:00.408172 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.408139 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.508725 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.508643 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.609241 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.609209 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.709823 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.709792 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.709823 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.709799 2578 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 22 14:15:00.710525 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.709959 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:00.710525 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.710007 2578 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 22 14:15:00.801854 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.801807 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-21 14:09:59 +0000 UTC" deadline="2027-10-11 03:19:08.61014255 +0000 UTC"
Apr 22 14:15:00.801854 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.801846 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12877h4m7.808300735s"
Apr 22 14:15:00.809910 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.809882 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.821080 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.821049 2578 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 22 14:15:00.848026 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.847993 2578 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 22 14:15:00.872619 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:00.872572 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod973f3092b03452d8285b08e0d93dce0b.slice/crio-a640c5a495d93b66b1b3b22c0ce8f3254cc75dab9c40c7f92148317248095cf9 WatchSource:0}: Error finding container a640c5a495d93b66b1b3b22c0ce8f3254cc75dab9c40c7f92148317248095cf9: Status 404 returned error can't find the container with id a640c5a495d93b66b1b3b22c0ce8f3254cc75dab9c40c7f92148317248095cf9
Apr 22 14:15:00.872869 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:00.872844 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20438f0448f48ac5e6fdc84ad54ce303.slice/crio-b853f048c7da4a410a9829a54aae9af016b140dc263a10f100236ab8f3335dca WatchSource:0}: Error finding container b853f048c7da4a410a9829a54aae9af016b140dc263a10f100236ab8f3335dca: Status 404 returned error can't find the container with id b853f048c7da4a410a9829a54aae9af016b140dc263a10f100236ab8f3335dca
Apr 22 14:15:00.873307 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.873276 2578 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-nvrjd"
Apr 22 14:15:00.875998 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.875973 2578 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:00.878774 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.878432 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:15:00.881543 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.881524 2578 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-nvrjd"
Apr 22 14:15:00.910291 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:00.910254 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:00.949702 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.949649 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal" event={"ID":"973f3092b03452d8285b08e0d93dce0b","Type":"ContainerStarted","Data":"a640c5a495d93b66b1b3b22c0ce8f3254cc75dab9c40c7f92148317248095cf9"}
Apr 22 14:15:00.950581 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:00.950561 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal" event={"ID":"20438f0448f48ac5e6fdc84ad54ce303","Type":"ContainerStarted","Data":"b853f048c7da4a410a9829a54aae9af016b140dc263a10f100236ab8f3335dca"}
Apr 22 14:15:01.010788 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.010761 2578 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-132-130.ec2.internal\" not found"
Apr 22 14:15:01.083554 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.083480 2578 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:01.120734 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.120708 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:01.133831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.133804 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]"
Apr 22 14:15:01.135596 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.135578 2578 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal"
Apr 22 14:15:01.144948 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.144922 2578 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result
in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 22 14:15:01.793610 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.793575 2578 apiserver.go:52] "Watching apiserver" Apr 22 14:15:01.801094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.801062 2578 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 22 14:15:01.803378 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.803348 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/konnectivity-agent-f7m86","kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9","openshift-cluster-node-tuning-operator/tuned-rkg84","openshift-image-registry/node-ca-mnpw8","openshift-multus/multus-qnk5j","openshift-network-diagnostics/network-check-target-9kmw7","openshift-dns/node-resolver-wbp94","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal","openshift-multus/multus-additional-cni-plugins-z67br","openshift-multus/network-metrics-daemon-8q2mm","openshift-network-operator/iptables-alerter-pcvj2","openshift-ovn-kubernetes/ovnkube-node-2kpzl"] Apr 22 14:15:01.804971 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.804944 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.806126 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.806105 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/konnectivity-agent-f7m86" Apr 22 14:15:01.807706 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.807687 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 22 14:15:01.807798 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.807694 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-x9sz9\"" Apr 22 14:15:01.808076 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.808059 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 22 14:15:01.808369 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.808341 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 22 14:15:01.808369 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.808379 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:01.808557 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.808468 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 22 14:15:01.808557 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.808472 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:01.808750 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.808728 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-4n44t\"" Apr 22 14:15:01.808905 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.808889 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 22 14:15:01.811384 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.811364 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.811488 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.811400 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mnpw8" Apr 22 14:15:01.813971 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.813947 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:01.814223 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.814206 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-2b9zp\"" Apr 22 14:15:01.814336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.814318 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.814397 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.814349 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:01.814521 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.814502 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 22 14:15:01.814681 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.814663 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 22 14:15:01.814786 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.814769 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 22 14:15:01.814940 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.814925 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-ppfc2\"" Apr 22 14:15:01.816203 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.816184 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-wbp94" Apr 22 14:15:01.817177 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.817155 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 22 14:15:01.817263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.817195 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 22 14:15:01.817448 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.817430 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 22 14:15:01.817524 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.817477 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.817785 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.817749 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 22 14:15:01.817785 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.817758 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-hqgwb\"" Apr 22 14:15:01.818631 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.818612 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 22 14:15:01.818739 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.818667 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 22 14:15:01.818807 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.818778 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-n4pfj\"" Apr 22 14:15:01.819051 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:15:01.819034 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:01.819121 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.819103 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:01.820031 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.820011 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 22 14:15:01.820031 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.820028 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 22 14:15:01.820157 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.820102 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-dzcwp\"" Apr 22 14:15:01.820728 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.820707 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pcvj2" Apr 22 14:15:01.822337 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.822319 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.823682 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.823593 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:15:01.823744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.823689 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 22 14:15:01.823907 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.823887 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-97skw\"" Apr 22 14:15:01.824012 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.823890 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 22 14:15:01.825090 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.825072 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 22 14:15:01.825190 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.825128 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 22 14:15:01.825422 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.825404 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 22 14:15:01.825496 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.825427 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 22 14:15:01.825496 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.825456 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 22 14:15:01.825702 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.825685 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-sncpp\"" Apr 22 14:15:01.825770 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.825726 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 22 14:15:01.834953 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.834926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-cni-netd\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.835066 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.834971 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-modprobe-d\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.835066 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835001 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.835066 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835035 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.835066 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835063 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cxs6x\" (UniqueName: \"kubernetes.io/projected/20f9c88a-aaca-401a-b81d-d9a32b00d92a-kube-api-access-cxs6x\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.835271 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835091 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-ovn\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.835271 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.835271 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835151 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4jrk9\" (UniqueName: \"kubernetes.io/projected/5678ae75-291c-4f06-82ee-c0d558cb29dc-kube-api-access-4jrk9\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.835271 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835194 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-registration-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.835271 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835230 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/215c8235-1207-4971-9cc3-8c7aaa57988c-agent-certs\") pod \"konnectivity-agent-f7m86\" (UID: \"215c8235-1207-4971-9cc3-8c7aaa57988c\") " pod="kube-system/konnectivity-agent-f7m86" Apr 22 14:15:01.835526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835291 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysconfig\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.835526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835331 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-system-cni-dir\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.835526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: 
\"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-socket-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.835526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835440 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wbgq5\" (UniqueName: \"kubernetes.io/projected/da61a168-341e-43e4-a7d8-7b24b79c346b-kube-api-access-wbgq5\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.835526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835471 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-kubernetes\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.835526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835500 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-run\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.835526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-os-release\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835542 2578 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20f9c88a-aaca-401a-b81d-d9a32b00d92a-cni-binary-copy\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835557 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-socket-dir-parent\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835577 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cnibin\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835603 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-system-cni-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835627 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" 
Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835687 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysctl-d\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835720 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cni-binary-copy\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835747 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qnk79\" (UniqueName: \"kubernetes.io/projected/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-kube-api-access-qnk79\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835772 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d2d20229-b414-4e83-be51-e3c0fd756697-iptables-alerter-script\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835795 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: 
\"kubernetes.io/configmap/de390571-ad59-463e-84ce-017e582c71b4-serviceca\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8" Apr 22 14:15:01.835831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835817 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hxxrj\" (UniqueName: \"kubernetes.io/projected/0a26fb71-5407-413c-a14c-18f3085f4abf-kube-api-access-hxxrj\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94" Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835842 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-env-overrides\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835865 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-hostroot\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835887 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysctl-conf\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835910 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-os-release\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835936 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de390571-ad59-463e-84ce-017e582c71b4-host\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8" Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835958 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-twlf5\" (UniqueName: \"kubernetes.io/projected/de390571-ad59-463e-84ce-017e582c71b4-kube-api-access-twlf5\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8" Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.835981 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-cnibin\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836003 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-var-lib-kubelet\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.836412 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:15:01.836026 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-tuned\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836048 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-etc-selinux\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836074 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/215c8235-1207-4971-9cc3-8c7aaa57988c-konnectivity-ca\") pod \"konnectivity-agent-f7m86\" (UID: \"215c8235-1207-4971-9cc3-8c7aaa57988c\") " pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836098 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-conf-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836150 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-slash\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-systemd\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.836412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836220 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-systemd\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836244 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2d20229-b414-4e83-be51-e3c0fd756697-host-slash\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836266 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-cni-bin\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836290 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-node-log\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836341 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-log-socket\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovnkube-script-lib\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836390 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-k8s-cni-cncf-io\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836414 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-pfzvk\" (UniqueName: \"kubernetes.io/projected/aee73a14-6669-4d65-8987-69628270ae6d-kube-api-access-pfzvk\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-cni-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836460 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-systemd-units\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836482 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-var-lib-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836504 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovnkube-config\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836528 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-sys-fs\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836551 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-sys\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836573 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-host\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836600 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-multus-certs\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836626 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a26fb71-5407-413c-a14c-18f3085f4abf-tmp-dir\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94"
Apr 22 14:15:01.837189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-run-netns\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836696 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-cni-multus\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836721 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-kubelet\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836748 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836774 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836800 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0a26fb71-5407-413c-a14c-18f3085f4abf-hosts-file\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836832 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-kubelet\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836863 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-cni-bin\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-device-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.836912 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837366 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rq7fq\" (UniqueName: \"kubernetes.io/projected/d2d20229-b414-4e83-be51-e3c0fd756697-kube-api-access-rq7fq\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837424 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-etc-kubernetes\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-etc-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837495 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovn-node-metrics-cert\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837517 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-lib-modules\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837555 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f99e885-249c-4aad-bcc4-6ad66292dd2f-tmp\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.838074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837584 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-clq2j\" (UniqueName: \"kubernetes.io/projected/0f99e885-249c-4aad-bcc4-6ad66292dd2f-kube-api-access-clq2j\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.838823 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-netns\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.838823 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.837646 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-daemon-config\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.882634 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.882601 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:00 +0000 UTC" deadline="2027-10-30 07:46:26.552414301 +0000 UTC"
Apr 22 14:15:01.882634 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.882628 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="13337h31m24.669788799s"
Apr 22 14:15:01.922224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.922190 2578 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world"
Apr 22 14:15:01.938055 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-modprobe-d\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938055 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.938274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938087 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.938274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cxs6x\" (UniqueName: \"kubernetes.io/projected/20f9c88a-aaca-401a-b81d-d9a32b00d92a-kube-api-access-cxs6x\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.938274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938145 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-ovn\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.938274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-modprobe-d\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938200 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-ovn\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.938274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938188 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-run-ovn-kubernetes\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938275 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4jrk9\" (UniqueName: \"kubernetes.io/projected/5678ae75-291c-4f06-82ee-c0d558cb29dc-kube-api-access-4jrk9\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938373 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-registration-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938419 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/215c8235-1207-4971-9cc3-8c7aaa57988c-agent-certs\") pod \"konnectivity-agent-f7m86\" (UID: \"215c8235-1207-4971-9cc3-8c7aaa57988c\") " pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938477 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-registration-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938506 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysconfig\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938531 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-system-cni-dir\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.938583 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-socket-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938594 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wbgq5\" (UniqueName: \"kubernetes.io/projected/da61a168-341e-43e4-a7d8-7b24b79c346b-kube-api-access-wbgq5\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938607 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-system-cni-dir\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-kubernetes\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-run\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938662 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysconfig\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-os-release\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938701 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20f9c88a-aaca-401a-b81d-d9a32b00d92a-cni-binary-copy\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938725 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-socket-dir-parent\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938727 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-socket-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938752 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cnibin\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938751 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938777 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-system-cni-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938793 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-run\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938829 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysctl-d\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938843 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-kubernetes\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.938995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938846 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cnibin\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938856 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cni-binary-copy\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938772 2578 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938882 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-system-cni-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938889 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938895 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qnk79\" (UniqueName: \"kubernetes.io/projected/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-kube-api-access-qnk79\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d2d20229-b414-4e83-be51-e3c0fd756697-iptables-alerter-script\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de390571-ad59-463e-84ce-017e582c71b4-serviceca\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938980 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysctl-d\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938840 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-socket-dir-parent\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.938980 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hxxrj\" (UniqueName: \"kubernetes.io/projected/0a26fb71-5407-413c-a14c-18f3085f4abf-kube-api-access-hxxrj\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939034 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-env-overrides\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-hostroot\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939085 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysctl-conf\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-os-release\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de390571-ad59-463e-84ce-017e582c71b4-host\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939160 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-twlf5\" (UniqueName: \"kubernetes.io/projected/de390571-ad59-463e-84ce-017e582c71b4-kube-api-access-twlf5\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8"
Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939193 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName:
\"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-cnibin\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.939794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939217 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-var-lib-kubelet\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-tuned\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939258 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/20f9c88a-aaca-401a-b81d-d9a32b00d92a-cni-binary-copy\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939266 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-etc-selinux\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939292 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: 
\"kubernetes.io/configmap/215c8235-1207-4971-9cc3-8c7aaa57988c-konnectivity-ca\") pod \"konnectivity-agent-f7m86\" (UID: \"215c8235-1207-4971-9cc3-8c7aaa57988c\") " pod="kube-system/konnectivity-agent-f7m86" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939345 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-os-release\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939358 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939382 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cni-binary-copy\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939386 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-conf-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" 
(UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-conf-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-slash\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939457 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-systemd\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939468 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-cnibin\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/de390571-ad59-463e-84ce-017e582c71b4-serviceca\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939486 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-kubelet-dir\") 
pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939516 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-var-lib-kubelet\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939519 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-systemd\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/d2d20229-b414-4e83-be51-e3c0fd756697-iptables-alerter-script\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2" Apr 22 14:15:01.940630 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939553 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/d2d20229-b414-4e83-be51-e3c0fd756697-host-slash\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939563 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-slash\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-systemd\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939584 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-cni-bin\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939608 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-run-systemd\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939642 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-node-log\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939609 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: 
\"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-node-log\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939675 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-log-socket\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939680 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-sysctl-conf\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939715 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-log-socket\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939741 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-os-release\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939748 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: 
\"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-hostroot\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939773 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/de390571-ad59-463e-84ce-017e582c71b4-host\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovnkube-script-lib\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-k8s-cni-cncf-io\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939848 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-pfzvk\" (UniqueName: \"kubernetes.io/projected/aee73a14-6669-4d65-8987-69628270ae6d-kube-api-access-pfzvk\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939874 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-cni-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-systemd-units\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.941447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939926 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-var-lib-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939924 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-env-overrides\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939950 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovnkube-config\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939953 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: 
\"kubernetes.io/host-path/d2d20229-b414-4e83-be51-e3c0fd756697-host-slash\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939527 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-kubelet-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.939976 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-sys-fs\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940002 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-sys\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940019 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-host\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940018 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" 
(UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-k8s-cni-cncf-io\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940040 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-multus-certs\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940066 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a26fb71-5407-413c-a14c-18f3085f4abf-tmp-dir\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940074 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-etc-selinux\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940093 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-run-netns\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" 
(UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-cni-multus\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940133 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-systemd-units\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940172 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-sys-fs\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940175 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-kubelet\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940213 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-multus-certs\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.942193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940222 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: 
\"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovnkube-script-lib\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940253 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-var-lib-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940289 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-cni-dir\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940292 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-run-netns\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940137 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-kubelet\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940355 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: 
\"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-sys\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940354 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940401 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940425 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/0a26fb71-5407-413c-a14c-18f3085f4abf-tmp-dir\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0a26fb71-5407-413c-a14c-18f3085f4abf-hosts-file\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94" Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940459 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-kubelet\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940485 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-cni-bin\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940494 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-cni-multus\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940510 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-device-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940536 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940563 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rq7fq\" (UniqueName: \"kubernetes.io/projected/d2d20229-b414-4e83-be51-e3c0fd756697-kube-api-access-rq7fq\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2"
Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940588 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-etc-kubernetes\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.942986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940612 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-var-lib-cni-bin\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940617 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-etc-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940646 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovn-node-metrics-cert\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940659 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-cni-bin\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940671 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-lib-modules\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940693 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f99e885-249c-4aad-bcc4-6ad66292dd2f-tmp\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940715 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-clq2j\" (UniqueName: \"kubernetes.io/projected/0f99e885-249c-4aad-bcc4-6ad66292dd2f-kube-api-access-clq2j\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-netns\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940739 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovnkube-config\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940781 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-tuning-conf-dir\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940782 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-etc-openvswitch\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940803 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-daemon-config\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-cni-netd\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940881 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/215c8235-1207-4971-9cc3-8c7aaa57988c-konnectivity-ca\") pod \"konnectivity-agent-f7m86\" (UID: \"215c8235-1207-4971-9cc3-8c7aaa57988c\") " pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940886 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/0a26fb71-5407-413c-a14c-18f3085f4abf-hosts-file\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-cni-netd\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940925 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/5678ae75-291c-4f06-82ee-c0d558cb29dc-host-kubelet\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940402 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-host\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.943604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.940963 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/da61a168-341e-43e4-a7d8-7b24b79c346b-device-dir\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.941143 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-etc-kubernetes\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.941235 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.941280 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/20f9c88a-aaca-401a-b81d-d9a32b00d92a-host-run-netns\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.941316 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:02.441278896 +0000 UTC m=+3.116082162 (durationBeforeRetry 500ms).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.941342 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/0f99e885-249c-4aad-bcc4-6ad66292dd2f-lib-modules\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.941589 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/20f9c88a-aaca-401a-b81d-d9a32b00d92a-multus-daemon-config\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.941723 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.942515 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/0f99e885-249c-4aad-bcc4-6ad66292dd2f-etc-tuned\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.942806 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/215c8235-1207-4971-9cc3-8c7aaa57988c-agent-certs\") pod \"konnectivity-agent-f7m86\" (UID: \"215c8235-1207-4971-9cc3-8c7aaa57988c\") " pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.943366 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/0f99e885-249c-4aad-bcc4-6ad66292dd2f-tmp\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.944081 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.943883 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/5678ae75-291c-4f06-82ee-c0d558cb29dc-ovn-node-metrics-cert\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.951854 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.951666 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:01.951854 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.951688 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:01.951854 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.951698 2578 projected.go:194] Error preparing data for projected volume kube-api-access-rqvjg for pod openshift-network-diagnostics/network-check-target-9kmw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:01.951854 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:01.951751 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg podName:177ed1a3-5a44-405b-a340-c5e0c5655232 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:02.45173379 +0000 UTC m=+3.126537032 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rqvjg" (UniqueName: "kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg") pod "network-check-target-9kmw7" (UID: "177ed1a3-5a44-405b-a340-c5e0c5655232") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:01.952696 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.952670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qnk79\" (UniqueName: \"kubernetes.io/projected/0e41ada6-0c6f-4380-8fc5-ac4005a2c30b-kube-api-access-qnk79\") pod \"multus-additional-cni-plugins-z67br\" (UID: \"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b\") " pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:01.953800 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.953771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wbgq5\" (UniqueName: \"kubernetes.io/projected/da61a168-341e-43e4-a7d8-7b24b79c346b-kube-api-access-wbgq5\") pod \"aws-ebs-csi-driver-node-nq6k9\" (UID: \"da61a168-341e-43e4-a7d8-7b24b79c346b\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:01.954267 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.954241 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-pfzvk\" (UniqueName: \"kubernetes.io/projected/aee73a14-6669-4d65-8987-69628270ae6d-kube-api-access-pfzvk\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:01.954816 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.954796 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cxs6x\" (UniqueName: \"kubernetes.io/projected/20f9c88a-aaca-401a-b81d-d9a32b00d92a-kube-api-access-cxs6x\") pod \"multus-qnk5j\" (UID: \"20f9c88a-aaca-401a-b81d-d9a32b00d92a\") " pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:01.954888 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.954863 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4jrk9\" (UniqueName: \"kubernetes.io/projected/5678ae75-291c-4f06-82ee-c0d558cb29dc-kube-api-access-4jrk9\") pod \"ovnkube-node-2kpzl\" (UID: \"5678ae75-291c-4f06-82ee-c0d558cb29dc\") " pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:01.955056 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.955035 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hxxrj\" (UniqueName: \"kubernetes.io/projected/0a26fb71-5407-413c-a14c-18f3085f4abf-kube-api-access-hxxrj\") pod \"node-resolver-wbp94\" (UID: \"0a26fb71-5407-413c-a14c-18f3085f4abf\") " pod="openshift-dns/node-resolver-wbp94"
Apr 22 14:15:01.955116 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.955089 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-twlf5\" (UniqueName: \"kubernetes.io/projected/de390571-ad59-463e-84ce-017e582c71b4-kube-api-access-twlf5\") pod \"node-ca-mnpw8\" (UID: \"de390571-ad59-463e-84ce-017e582c71b4\") " pod="openshift-image-registry/node-ca-mnpw8"
Apr 22 14:15:01.955767 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.955744 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rq7fq\" (UniqueName: \"kubernetes.io/projected/d2d20229-b414-4e83-be51-e3c0fd756697-kube-api-access-rq7fq\") pod \"iptables-alerter-pcvj2\" (UID: \"d2d20229-b414-4e83-be51-e3c0fd756697\") " pod="openshift-network-operator/iptables-alerter-pcvj2"
Apr 22 14:15:01.956092 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.956076 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-clq2j\" (UniqueName: \"kubernetes.io/projected/0f99e885-249c-4aad-bcc4-6ad66292dd2f-kube-api-access-clq2j\") pod \"tuned-rkg84\" (UID: \"0f99e885-249c-4aad-bcc4-6ad66292dd2f\") " pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:01.970415 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:01.970395 2578 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:02.051386 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.051287 2578 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 22 14:15:02.119821 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.119790 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9"
Apr 22 14:15:02.126690 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.126657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:02.135154 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.135134 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-rkg84"
Apr 22 14:15:02.140794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.140776 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-mnpw8"
Apr 22 14:15:02.146556 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.146534 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-qnk5j"
Apr 22 14:15:02.155060 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.155038 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-wbp94"
Apr 22 14:15:02.160577 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.160552 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-z67br"
Apr 22 14:15:02.169113 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.169092 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pcvj2"
Apr 22 14:15:02.174858 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.174836 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:02.444596 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.444556 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:02.444791 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:02.444728 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:02.444859 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:02.444810 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:03.444787304 +0000 UTC m=+4.119590558 (durationBeforeRetry 1s).
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:02.514510 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:02.514480 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0a26fb71_5407_413c_a14c_18f3085f4abf.slice/crio-75db0212b8937a05c04c6c22454ff08bee946cc1e33d21fd9f9d91d6f00ca997 WatchSource:0}: Error finding container 75db0212b8937a05c04c6c22454ff08bee946cc1e33d21fd9f9d91d6f00ca997: Status 404 returned error can't find the container with id 75db0212b8937a05c04c6c22454ff08bee946cc1e33d21fd9f9d91d6f00ca997
Apr 22 14:15:02.515733 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:02.515687 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0e41ada6_0c6f_4380_8fc5_ac4005a2c30b.slice/crio-c998d0cc034b5bace8b7ad34b80967ec605781b650e987c3168c075cd1df3985 WatchSource:0}: Error finding container c998d0cc034b5bace8b7ad34b80967ec605781b650e987c3168c075cd1df3985: Status 404 returned error can't find the container with id c998d0cc034b5bace8b7ad34b80967ec605781b650e987c3168c075cd1df3985
Apr 22 14:15:02.518147 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:02.518090 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde390571_ad59_463e_84ce_017e582c71b4.slice/crio-0c4e3d2e81aecb8cd30db626775f15e06e1dc10f44e9f9eed5782dcd7b8f796c WatchSource:0}: Error finding container 0c4e3d2e81aecb8cd30db626775f15e06e1dc10f44e9f9eed5782dcd7b8f796c: Status 404 returned error can't find the container with id 0c4e3d2e81aecb8cd30db626775f15e06e1dc10f44e9f9eed5782dcd7b8f796c
Apr 22 14:15:02.518732 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:02.518706 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd2d20229_b414_4e83_be51_e3c0fd756697.slice/crio-99964316b394bc4bae928be72e36eacfcce00fb30a5b2921b4468e2bbc7620e2 WatchSource:0}: Error finding container 99964316b394bc4bae928be72e36eacfcce00fb30a5b2921b4468e2bbc7620e2: Status 404 returned error can't find the container with id 99964316b394bc4bae928be72e36eacfcce00fb30a5b2921b4468e2bbc7620e2
Apr 22 14:15:02.519766 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:02.519742 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podda61a168_341e_43e4_a7d8_7b24b79c346b.slice/crio-b3c69bbf8c702688fc58f5cc6f2e753a8bbc76d343234836c8ab0f279080033b WatchSource:0}: Error finding container b3c69bbf8c702688fc58f5cc6f2e753a8bbc76d343234836c8ab0f279080033b: Status 404 returned error can't find the container with id b3c69bbf8c702688fc58f5cc6f2e753a8bbc76d343234836c8ab0f279080033b
Apr 22 14:15:02.522035 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:02.521941 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod215c8235_1207_4971_9cc3_8c7aaa57988c.slice/crio-cb2ed4fc6fd113a1854b3c9e4d9e20f17054e897afb0c83c37608ee5e6f7ac8d WatchSource:0}: Error finding container cb2ed4fc6fd113a1854b3c9e4d9e20f17054e897afb0c83c37608ee5e6f7ac8d: Status 404 returned error can't find the container with id cb2ed4fc6fd113a1854b3c9e4d9e20f17054e897afb0c83c37608ee5e6f7ac8d
Apr 22 14:15:02.522801 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:02.522733 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod20f9c88a_aaca_401a_b81d_d9a32b00d92a.slice/crio-949246f2e4ae4231c19d10f0a42660963b292a59f6f895a02e9a775eb01e3718 WatchSource:0}: Error finding container 949246f2e4ae4231c19d10f0a42660963b292a59f6f895a02e9a775eb01e3718: Status 404 returned error can't find the container with id 949246f2e4ae4231c19d10f0a42660963b292a59f6f895a02e9a775eb01e3718
Apr 22 14:15:02.524368 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:02.524281 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f99e885_249c_4aad_bcc4_6ad66292dd2f.slice/crio-666ccb413ca81541bfb1a38efa877f146df3b11c8148caf7993f8a31862e3207 WatchSource:0}: Error finding container 666ccb413ca81541bfb1a38efa877f146df3b11c8148caf7993f8a31862e3207: Status 404 returned error can't find the container with id 666ccb413ca81541bfb1a38efa877f146df3b11c8148caf7993f8a31862e3207
Apr 22 14:15:02.545212 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.545043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:02.545287 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:02.545197 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:02.545338 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:02.545293 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:02.545338 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:02.545318 2578 projected.go:194] Error preparing data for projected volume kube-api-access-rqvjg for pod openshift-network-diagnostics/network-check-target-9kmw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:02.545408 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:02.545374 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg podName:177ed1a3-5a44-405b-a340-c5e0c5655232 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:03.545359286 +0000 UTC m=+4.220162528 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-rqvjg" (UniqueName: "kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg") pod "network-check-target-9kmw7" (UID: "177ed1a3-5a44-405b-a340-c5e0c5655232") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:02.882841 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.882786 2578 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-21 14:10:00 +0000 UTC" deadline="2028-02-02 17:05:56.00853093 +0000 UTC"
Apr 22 14:15:02.882841 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.882830 2578 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="15626h50m53.125705217s"
Apr 22 14:15:02.956534 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.956460 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pcvj2" event={"ID":"d2d20229-b414-4e83-be51-e3c0fd756697","Type":"ContainerStarted","Data":"99964316b394bc4bae928be72e36eacfcce00fb30a5b2921b4468e2bbc7620e2"}
Apr 22 14:15:02.963508 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.963447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mnpw8"
event={"ID":"de390571-ad59-463e-84ce-017e582c71b4","Type":"ContainerStarted","Data":"0c4e3d2e81aecb8cd30db626775f15e06e1dc10f44e9f9eed5782dcd7b8f796c"}
Apr 22 14:15:02.971905 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.971833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" event={"ID":"da61a168-341e-43e4-a7d8-7b24b79c346b","Type":"ContainerStarted","Data":"b3c69bbf8c702688fc58f5cc6f2e753a8bbc76d343234836c8ab0f279080033b"}
Apr 22 14:15:02.980325 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:02.980254 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wbp94" event={"ID":"0a26fb71-5407-413c-a14c-18f3085f4abf","Type":"ContainerStarted","Data":"75db0212b8937a05c04c6c22454ff08bee946cc1e33d21fd9f9d91d6f00ca997"}
Apr 22 14:15:03.001186 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.001147 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal" event={"ID":"973f3092b03452d8285b08e0d93dce0b","Type":"ContainerStarted","Data":"686dc4deea015ff22a19d95dfe33dee7278b3892a7229d10d6bfd2f0f9b37062"}
Apr 22 14:15:03.004053 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.003974 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"a7fd66fdd596606695e0886ad693207db182cfc109daf370b1a2a34cad8382e8"}
Apr 22 14:15:03.008428 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.008368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rkg84" event={"ID":"0f99e885-249c-4aad-bcc4-6ad66292dd2f","Type":"ContainerStarted","Data":"666ccb413ca81541bfb1a38efa877f146df3b11c8148caf7993f8a31862e3207"}
Apr 22 14:15:03.018937 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.018857 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnk5j" event={"ID":"20f9c88a-aaca-401a-b81d-d9a32b00d92a","Type":"ContainerStarted","Data":"949246f2e4ae4231c19d10f0a42660963b292a59f6f895a02e9a775eb01e3718"}
Apr 22 14:15:03.021268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.021235 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f7m86" event={"ID":"215c8235-1207-4971-9cc3-8c7aaa57988c","Type":"ContainerStarted","Data":"cb2ed4fc6fd113a1854b3c9e4d9e20f17054e897afb0c83c37608ee5e6f7ac8d"}
Apr 22 14:15:03.041689 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.041624 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerStarted","Data":"c998d0cc034b5bace8b7ad34b80967ec605781b650e987c3168c075cd1df3985"}
Apr 22 14:15:03.454419 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.454337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:03.454598 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:03.454486 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:03.454598 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:03.454552 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:05.454532166 +0000 UTC m=+6.129335415 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:03.555602 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.555567 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:03.555765 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:03.555739 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:03.555765 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:03.555760 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:03.555862 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:03.555774 2578 projected.go:194] Error preparing data for projected volume kube-api-access-rqvjg for pod openshift-network-diagnostics/network-check-target-9kmw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:03.555862 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:03.555832 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg podName:177ed1a3-5a44-405b-a340-c5e0c5655232 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:05.555814297 +0000 UTC m=+6.230617550 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "kube-api-access-rqvjg" (UniqueName: "kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg") pod "network-check-target-9kmw7" (UID: "177ed1a3-5a44-405b-a340-c5e0c5655232") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:03.948921 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.948486 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:03.948921 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:03.948505 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:03.948921 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:03.948618 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232"
Apr 22 14:15:03.948921 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:03.948761 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:15:04.079399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:04.078789 2578 generic.go:358] "Generic (PLEG): container finished" podID="20438f0448f48ac5e6fdc84ad54ce303" containerID="cd2dea153437be1b57d73b47cb712175158329cb5fe25084fed4521e13ee61ef" exitCode=0
Apr 22 14:15:04.079399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:04.078941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal" event={"ID":"20438f0448f48ac5e6fdc84ad54ce303","Type":"ContainerDied","Data":"cd2dea153437be1b57d73b47cb712175158329cb5fe25084fed4521e13ee61ef"}
Apr 22 14:15:04.093889 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:04.093823 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/kube-apiserver-proxy-ip-10-0-132-130.ec2.internal" podStartSLOduration=3.093804665 podStartE2EDuration="3.093804665s" podCreationTimestamp="2026-04-22 14:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:03.014978014 +0000 UTC m=+3.689781281" watchObservedRunningTime="2026-04-22 14:15:04.093804665 +0000 UTC m=+4.768607934"
Apr 22 14:15:05.086570 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:05.086534 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal" event={"ID":"20438f0448f48ac5e6fdc84ad54ce303","Type":"ContainerStarted","Data":"ec1011912828ea41c48fb1f5c5f3d954dae82a63380a26a36e124ab5753465be"}
Apr 22 14:15:05.477416 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:05.476740 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod
\"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:05.477416 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:05.476926 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:05.477416 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:05.476993 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:09.476972648 +0000 UTC m=+10.151775893 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:05.578113 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:05.578005 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:05.578325 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:05.578185 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:05.578325 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:05.578211 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not 
registered Apr 22 14:15:05.578325 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:05.578226 2578 projected.go:194] Error preparing data for projected volume kube-api-access-rqvjg for pod openshift-network-diagnostics/network-check-target-9kmw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:05.578325 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:05.578309 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg podName:177ed1a3-5a44-405b-a340-c5e0c5655232 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:09.578275021 +0000 UTC m=+10.253078284 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "kube-api-access-rqvjg" (UniqueName: "kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg") pod "network-check-target-9kmw7" (UID: "177ed1a3-5a44-405b-a340-c5e0c5655232") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:05.947141 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:05.947108 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:05.947334 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:05.947222 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:05.947610 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:05.947502 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:05.947720 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:05.947629 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:07.946999 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:07.946961 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:07.947511 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:07.947107 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:07.947664 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:07.947643 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:07.947797 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:07.947772 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:09.513048 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:09.512986 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:09.513521 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:09.513143 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:09.513521 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:09.513209 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:17.513189381 +0000 UTC m=+18.187992626 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:09.613495 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:09.613409 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:09.613672 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:09.613585 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:09.613672 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:09.613609 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:09.613672 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:09.613622 2578 projected.go:194] Error preparing data for projected volume kube-api-access-rqvjg for pod openshift-network-diagnostics/network-check-target-9kmw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:09.613817 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:09.613679 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg podName:177ed1a3-5a44-405b-a340-c5e0c5655232 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:17.613661835 +0000 UTC m=+18.288465083 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-rqvjg" (UniqueName: "kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg") pod "network-check-target-9kmw7" (UID: "177ed1a3-5a44-405b-a340-c5e0c5655232") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:09.948569 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:09.948528 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:09.948744 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:09.948650 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:09.949349 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:09.949188 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:09.949349 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:09.949313 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:11.947418 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:11.947382 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:11.947876 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:11.947382 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:11.947876 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:11.947525 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:11.947876 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:11.947585 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:13.947655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:13.947616 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:13.948178 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:13.947626 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:13.948178 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:13.947767 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:13.948178 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:13.947860 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:15.946881 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:15.946844 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:15.947365 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:15.946984 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:15.947365 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:15.947041 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:15.947365 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:15.947178 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:17.568519 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:17.568472 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:17.568935 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:17.568653 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:17.568935 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:17.568736 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.568715178 +0000 UTC m=+34.243518436 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 22 14:15:17.669855 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:17.669808 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:17.670048 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:17.670001 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 22 14:15:17.670048 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:17.670029 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 22 14:15:17.670048 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:17.670044 2578 projected.go:194] Error preparing data for projected volume kube-api-access-rqvjg for pod openshift-network-diagnostics/network-check-target-9kmw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:17.670187 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:17.670111 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg podName:177ed1a3-5a44-405b-a340-c5e0c5655232 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:33.670094792 +0000 UTC m=+34.344898056 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-rqvjg" (UniqueName: "kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg") pod "network-check-target-9kmw7" (UID: "177ed1a3-5a44-405b-a340-c5e0c5655232") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:17.946810 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:17.946774 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:17.946985 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:17.946890 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:17.946985 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:17.946957 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:17.947115 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:17.947080 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:19.948679 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:19.948513 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:19.948679 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:19.948618 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232" Apr 22 14:15:19.949485 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:19.949322 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:19.949485 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:19.949441 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d" Apr 22 14:15:20.116122 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.115929 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"5f5994a7e228339147f34660e43d17ee1a0745650507254b4800f45436a11a8b"} Apr 22 14:15:20.117277 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.117244 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-rkg84" event={"ID":"0f99e885-249c-4aad-bcc4-6ad66292dd2f","Type":"ContainerStarted","Data":"9ca2e8974ba50ec4859c4faab011e93a0fdd2cc3b79d2dc7dad37491e24806ef"} Apr 22 14:15:20.118792 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.118768 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-qnk5j" event={"ID":"20f9c88a-aaca-401a-b81d-d9a32b00d92a","Type":"ContainerStarted","Data":"ba3b819c2ef437ca9d532315199e9f8afb78f5f8f1dbd7c3542cf9fce741e0e1"} Apr 22 14:15:20.120045 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.120018 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-f7m86" event={"ID":"215c8235-1207-4971-9cc3-8c7aaa57988c","Type":"ContainerStarted","Data":"eb91a4793f1684c4c1ca67261b6847f5ffc512b5c44843b72066793dea09b099"} Apr 22 14:15:20.121232 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.121207 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerStarted","Data":"2c1bd639f9c6da129bd57365c6d06b1445cf80808412795ae56f6087e1eb8ee0"} Apr 22 14:15:20.122534 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.122516 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-mnpw8" 
event={"ID":"de390571-ad59-463e-84ce-017e582c71b4","Type":"ContainerStarted","Data":"5b4168ee9a0ef588320d31dd558f3414d5a6243a94a4d1c9a99c274f7458fdfe"} Apr 22 14:15:20.136349 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.136290 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-rkg84" podStartSLOduration=2.869627449 podStartE2EDuration="20.136276923s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.526558264 +0000 UTC m=+3.201361520" lastFinishedPulling="2026-04-22 14:15:19.793207736 +0000 UTC m=+20.468010994" observedRunningTime="2026-04-22 14:15:20.136144355 +0000 UTC m=+20.810947621" watchObservedRunningTime="2026-04-22 14:15:20.136276923 +0000 UTC m=+20.811080187" Apr 22 14:15:20.136568 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.136539 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-132-130.ec2.internal" podStartSLOduration=19.136534072 podStartE2EDuration="19.136534072s" podCreationTimestamp="2026-04-22 14:15:01 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:15:05.101231064 +0000 UTC m=+5.776034329" watchObservedRunningTime="2026-04-22 14:15:20.136534072 +0000 UTC m=+20.811337338" Apr 22 14:15:20.156257 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.156209 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/konnectivity-agent-f7m86" podStartSLOduration=4.264185597 podStartE2EDuration="21.156193303s" podCreationTimestamp="2026-04-22 14:14:59 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.524812532 +0000 UTC m=+3.199615779" lastFinishedPulling="2026-04-22 14:15:19.416820229 +0000 UTC m=+20.091623485" observedRunningTime="2026-04-22 14:15:20.155499568 +0000 UTC m=+20.830302834" 
watchObservedRunningTime="2026-04-22 14:15:20.156193303 +0000 UTC m=+20.830996568"
Apr 22 14:15:20.172722 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:20.172678 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-mnpw8" podStartSLOduration=7.7433730050000005 podStartE2EDuration="20.172663257s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.520737778 +0000 UTC m=+3.195541028" lastFinishedPulling="2026-04-22 14:15:14.950028023 +0000 UTC m=+15.624831280" observedRunningTime="2026-04-22 14:15:20.17245647 +0000 UTC m=+20.847259745" watchObservedRunningTime="2026-04-22 14:15:20.172663257 +0000 UTC m=+20.847466521"
Apr 22 14:15:21.125606 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.125362 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e41ada6-0c6f-4380-8fc5-ac4005a2c30b" containerID="2c1bd639f9c6da129bd57365c6d06b1445cf80808412795ae56f6087e1eb8ee0" exitCode=0
Apr 22 14:15:21.125606 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.125449 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerDied","Data":"2c1bd639f9c6da129bd57365c6d06b1445cf80808412795ae56f6087e1eb8ee0"}
Apr 22 14:15:21.126969 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.126944 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" event={"ID":"da61a168-341e-43e4-a7d8-7b24b79c346b","Type":"ContainerStarted","Data":"3b8c745e1e6335ea57e7e04badcdd5a4a0b1255e541b4ab583e74982a4daa629"}
Apr 22 14:15:21.128025 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.128002 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-wbp94" event={"ID":"0a26fb71-5407-413c-a14c-18f3085f4abf","Type":"ContainerStarted","Data":"ad7b627b8d37e557a81cd62566e7d8b92196927c3c3a9d140659244a59e86ffb"}
Apr 22 14:15:21.130353 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.130336 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:15:21.130604 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.130588 2578 generic.go:358] "Generic (PLEG): container finished" podID="5678ae75-291c-4f06-82ee-c0d558cb29dc" containerID="9896e2bdfa5dd8dfe75715fdd4844861bf3560f4c5616e3bb931395b8d3386db" exitCode=1
Apr 22 14:15:21.130709 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.130687 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"fae683cf2190958b3c265ed98dc2c39fa1596359199021008340018ce58b9fc2"}
Apr 22 14:15:21.130776 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.130719 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"24be7a8fac17773934682dee80cb60f316674792c9b4860c7226aa1c040bccee"}
Apr 22 14:15:21.130776 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.130733 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"b45b429ecd3623e587a927ab985809670d5657b6b4d3ca5512a3d43822a79608"}
Apr 22 14:15:21.130776 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.130746 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"39ff82ca791e389d04111fd219ddf358bc49e2f7103fd2bf9a4440020b42371e"}
Apr 22 14:15:21.130776 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.130758 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerDied","Data":"9896e2bdfa5dd8dfe75715fdd4844861bf3560f4c5616e3bb931395b8d3386db"}
Apr 22 14:15:21.169432 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.169386 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-qnk5j" podStartSLOduration=3.880424423 podStartE2EDuration="21.169372835s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.52486773 +0000 UTC m=+3.199670989" lastFinishedPulling="2026-04-22 14:15:19.813816144 +0000 UTC m=+20.488619401" observedRunningTime="2026-04-22 14:15:21.168915416 +0000 UTC m=+21.843718710" watchObservedRunningTime="2026-04-22 14:15:21.169372835 +0000 UTC m=+21.844176100"
Apr 22 14:15:21.189538 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.189492 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-wbp94" podStartSLOduration=3.915443913 podStartE2EDuration="21.18947866s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.516508014 +0000 UTC m=+3.191311257" lastFinishedPulling="2026-04-22 14:15:19.790542751 +0000 UTC m=+20.465346004" observedRunningTime="2026-04-22 14:15:21.189366899 +0000 UTC m=+21.864170163" watchObservedRunningTime="2026-04-22 14:15:21.18947866 +0000 UTC m=+21.864281926"
Apr 22 14:15:21.378686 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.378662 2578 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock"
Apr 22 14:15:21.890145 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.890040 2578 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-22T14:15:21.378679728Z","UUID":"ed12dac7-88fc-46dd-8a84-acd36c904151","Handler":null,"Name":"","Endpoint":""}
Apr 22 14:15:21.891969 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.891947 2578 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0
Apr 22 14:15:21.891969 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.891972 2578 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock
Apr 22 14:15:21.947590 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.947403 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:21.947768 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:21.947403 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:21.947768 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:21.947696 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:15:21.947768 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:21.947749 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232"
Apr 22 14:15:22.133432 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:22.133397 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pcvj2" event={"ID":"d2d20229-b414-4e83-be51-e3c0fd756697","Type":"ContainerStarted","Data":"c9ecc30db2e7885b99101c9b5a35d1695408641d3c309de445cd92670b442626"}
Apr 22 14:15:22.134870 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:22.134847 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" event={"ID":"da61a168-341e-43e4-a7d8-7b24b79c346b","Type":"ContainerStarted","Data":"55c129e66fb2874dd395387c9c5f295d8fc944a32eab3673755747121573b97e"}
Apr 22 14:15:23.142660 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:23.142624 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:15:23.143505 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:23.143470 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"9f41025bf1d4b0cfcb16e9ea3cb123683818dbabd2d84eb9d62b6ca0078f068d"}
Apr 22 14:15:23.145515 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:23.145488 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" event={"ID":"da61a168-341e-43e4-a7d8-7b24b79c346b","Type":"ContainerStarted","Data":"7e12598d52ffc9775098f6be3453d5d7ec5aaad507ebf10e6c5381b5b422be83"}
Apr 22 14:15:23.166436 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:23.166377 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pcvj2" podStartSLOduration=6.270251955 podStartE2EDuration="23.166357396s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.52071145 +0000 UTC m=+3.195514692" lastFinishedPulling="2026-04-22 14:15:19.41681689 +0000 UTC m=+20.091620133" observedRunningTime="2026-04-22 14:15:22.151532338 +0000 UTC m=+22.826335602" watchObservedRunningTime="2026-04-22 14:15:23.166357396 +0000 UTC m=+23.841160662"
Apr 22 14:15:23.166780 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:23.166747 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-nq6k9" podStartSLOduration=4.365845701 podStartE2EDuration="24.1667353s" podCreationTimestamp="2026-04-22 14:14:59 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.521720241 +0000 UTC m=+3.196523499" lastFinishedPulling="2026-04-22 14:15:22.322609855 +0000 UTC m=+22.997413098" observedRunningTime="2026-04-22 14:15:23.166233135 +0000 UTC m=+23.841036401" watchObservedRunningTime="2026-04-22 14:15:23.1667353 +0000 UTC m=+23.841538567"
Apr 22 14:15:23.947033 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:23.946993 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:23.947033 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:23.947014 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:23.947251 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:23.947144 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:15:23.947326 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:23.947283 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232"
Apr 22 14:15:24.015756 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:24.015714 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:24.016489 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:24.016462 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:25.950318 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:25.950277 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:25.950853 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:25.950278 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:25.950853 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:25.950389 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232"
Apr 22 14:15:25.950853 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:25.950471 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:15:26.154814 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:26.154438 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:15:26.155271 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:26.154957 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"d984f1f494a8fa2988b7994878da48778aa60fdaa486a8b0adac85dadc9502db"}
Apr 22 14:15:26.155671 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:26.155352 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:26.155671 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:26.155391 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:26.156011 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:26.155678 2578 scope.go:117] "RemoveContainer" containerID="9896e2bdfa5dd8dfe75715fdd4844861bf3560f4c5616e3bb931395b8d3386db"
Apr 22 14:15:26.172774 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:26.172593 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:27.158545 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.158503 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e41ada6-0c6f-4380-8fc5-ac4005a2c30b" containerID="59b157b89904e89a5dcd3677df3fec52a412be824393c057745d3f395b767116" exitCode=0
Apr 22 14:15:27.159109 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.158589 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerDied","Data":"59b157b89904e89a5dcd3677df3fec52a412be824393c057745d3f395b767116"}
Apr 22 14:15:27.162126 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.162103 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:15:27.162412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.162389 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" event={"ID":"5678ae75-291c-4f06-82ee-c0d558cb29dc","Type":"ContainerStarted","Data":"b17c65a81be8349c74ee9383c631dc0d0fc8529bcbfae487ec2e8b414c4847fa"}
Apr 22 14:15:27.162717 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.162699 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:27.177823 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.177794 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl"
Apr 22 14:15:27.207447 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.207398 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" podStartSLOduration=9.897621497 podStartE2EDuration="27.207380845s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.528495342 +0000 UTC m=+3.203298600" lastFinishedPulling="2026-04-22 14:15:19.838254701 +0000 UTC m=+20.513057948" observedRunningTime="2026-04-22 14:15:27.207170156 +0000 UTC m=+27.881973421" watchObservedRunningTime="2026-04-22 14:15:27.207380845 +0000 UTC m=+27.882184107"
Apr 22 14:15:27.947393 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.947361 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:27.947537 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:27.947493 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232"
Apr 22 14:15:27.947537 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:27.947532 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:27.947680 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:27.947641 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:15:28.166487 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:28.166277 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e41ada6-0c6f-4380-8fc5-ac4005a2c30b" containerID="b968b33d6b081e41ab7266e559e7419ebcba22c90cd67f4f91d56d9a66a34a64" exitCode=0
Apr 22 14:15:28.166808 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:28.166338 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerDied","Data":"b968b33d6b081e41ab7266e559e7419ebcba22c90cd67f4f91d56d9a66a34a64"}
Apr 22 14:15:28.357425 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:28.357393 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9kmw7"]
Apr 22 14:15:28.357562 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:28.357500 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:28.357601 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:28.357572 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232"
Apr 22 14:15:28.361118 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:28.361092 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8q2mm"]
Apr 22 14:15:28.361225 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:28.361198 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:28.361290 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:28.361273 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:15:29.949547 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:29.949514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:29.950218 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:29.949514 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:29.950218 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:29.949618 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:15:29.950218 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:29.949679 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232"
Apr 22 14:15:30.171236 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:30.171202 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e41ada6-0c6f-4380-8fc5-ac4005a2c30b" containerID="2e2b6d7c6728cb065be935e1263a1fcc87e4f25ae536d0920216e7fd6546c487" exitCode=0
Apr 22 14:15:30.171398 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:30.171256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerDied","Data":"2e2b6d7c6728cb065be935e1263a1fcc87e4f25ae536d0920216e7fd6546c487"}
Apr 22 14:15:30.370392 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:30.370354 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:30.370536 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:30.370497 2578 prober_manager.go:312] "Failed to trigger a manual run" probe="Readiness"
Apr 22 14:15:30.371139 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:30.371114 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-f7m86"
Apr 22 14:15:31.947429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:31.947395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:31.947867 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:31.947436 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:31.947867 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:31.947518 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-9kmw7" podUID="177ed1a3-5a44-405b-a340-c5e0c5655232"
Apr 22 14:15:31.947867 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:31.947635 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:15:33.129571 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.129538 2578 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-132-130.ec2.internal" event="NodeReady"
Apr 22 14:15:33.130057 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.129695 2578 kubelet_node_status.go:550] "Fast updating node status as it just became ready"
Apr 22 14:15:33.178988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.178957 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-kxldt"]
Apr 22 14:15:33.194282 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.194248 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-xhmzs"]
Apr 22 14:15:33.194469 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.194395 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.198174 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.197289 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\""
Apr 22 14:15:33.198174 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.197410 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hhqr7\""
Apr 22 14:15:33.198174 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.197688 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\""
Apr 22 14:15:33.202471 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.202447 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kxldt"]
Apr 22 14:15:33.202585 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.202554 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:15:33.205537 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.205514 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\""
Apr 22 14:15:33.205537 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.205528 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qh95w\""
Apr 22 14:15:33.205537 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.205533 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xhmzs"]
Apr 22 14:15:33.205764 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.205751 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\""
Apr 22 14:15:33.205934 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.205918 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\""
Apr 22 14:15:33.279055 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.279013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-config-volume\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.279235 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.279068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-smlrv\" (UniqueName: \"kubernetes.io/projected/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-kube-api-access-smlrv\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.279235 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.279172 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-tmp-dir\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.279235 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.279213 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.279235 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.279233 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:15:33.279471 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.279272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gcdkl\" (UniqueName: \"kubernetes.io/projected/52ec1912-fa03-4d61-8364-c8cc1159fcb5-kube-api-access-gcdkl\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:15:33.380618 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.380535 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-tmp-dir\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.380618 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.380577 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.380618 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.380595 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:15:33.380853 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.380627 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gcdkl\" (UniqueName: \"kubernetes.io/projected/52ec1912-fa03-4d61-8364-c8cc1159fcb5-kube-api-access-gcdkl\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:15:33.380853 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.380656 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-config-volume\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.380853 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.380683 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-smlrv\" (UniqueName: \"kubernetes.io/projected/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-kube-api-access-smlrv\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.380853 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.380699 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 22 14:15:33.380853 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.380749 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 22 14:15:33.380853 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.380789 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls podName:46042ad7-7ef3-4e0b-88e4-d9d9077a34d0 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.880764052 +0000 UTC m=+34.555567295 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls") pod "dns-default-kxldt" (UID: "46042ad7-7ef3-4e0b-88e4-d9d9077a34d0") : secret "dns-default-metrics-tls" not found
Apr 22 14:15:33.380853 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.380814 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert podName:52ec1912-fa03-4d61-8364-c8cc1159fcb5 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:33.88079924 +0000 UTC m=+34.555602496 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert") pod "ingress-canary-xhmzs" (UID: "52ec1912-fa03-4d61-8364-c8cc1159fcb5") : secret "canary-serving-cert" not found
Apr 22 14:15:33.381098 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.380945 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-tmp-dir\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.381224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.381206 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-config-volume\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.391241 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.391064 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-smlrv\" (UniqueName: \"kubernetes.io/projected/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-kube-api-access-smlrv\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:15:33.391405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.391373 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gcdkl\" (UniqueName: \"kubernetes.io/projected/52ec1912-fa03-4d61-8364-c8cc1159fcb5-kube-api-access-gcdkl\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:15:33.582096 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.582055 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:15:33.582313 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.582171 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:33.582313 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.582224 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:16:05.582209699 +0000 UTC m=+66.257012941 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 22 14:15:33.682948 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.682829 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7"
Apr 22 14:15:33.683109 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.682997 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 22 14:15:33.683109 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.683019 2578 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 22 14:15:33.683109 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.683028 2578 projected.go:194] Error preparing data for projected volume kube-api-access-rqvjg for pod openshift-network-diagnostics/network-check-target-9kmw7: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 22 14:15:33.683109 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.683091 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg podName:177ed1a3-5a44-405b-a340-c5e0c5655232 nodeName:}" failed.
No retries permitted until 2026-04-22 14:16:05.683069514 +0000 UTC m=+66.357872758 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "kube-api-access-rqvjg" (UniqueName: "kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg") pod "network-check-target-9kmw7" (UID: "177ed1a3-5a44-405b-a340-c5e0c5655232") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 22 14:15:33.884736 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.884698 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt" Apr 22 14:15:33.884736 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.884739 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs" Apr 22 14:15:33.884969 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.884856 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:33.884969 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.884867 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:33.884969 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.884923 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls podName:46042ad7-7ef3-4e0b-88e4-d9d9077a34d0 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:15:34.884909426 +0000 UTC m=+35.559712673 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls") pod "dns-default-kxldt" (UID: "46042ad7-7ef3-4e0b-88e4-d9d9077a34d0") : secret "dns-default-metrics-tls" not found Apr 22 14:15:33.884969 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:33.884938 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert podName:52ec1912-fa03-4d61-8364-c8cc1159fcb5 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:34.884931492 +0000 UTC m=+35.559734734 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert") pod "ingress-canary-xhmzs" (UID: "52ec1912-fa03-4d61-8364-c8cc1159fcb5") : secret "canary-serving-cert" not found Apr 22 14:15:33.947831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.947754 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:15:33.948130 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.947754 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:15:33.950864 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.950666 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:15:33.950864 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.950736 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:15:33.950864 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.950760 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4nhsh\"" Apr 22 14:15:33.951075 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.951006 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-95blp\"" Apr 22 14:15:33.951219 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:33.951198 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:15:34.891761 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:34.891730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt" Apr 22 14:15:34.891761 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:34.891766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs" Apr 22 14:15:34.892257 ip-10-0-132-130 
kubenswrapper[2578]: E0422 14:15:34.891877 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:34.892257 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:34.891881 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:34.892257 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:34.891939 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert podName:52ec1912-fa03-4d61-8364-c8cc1159fcb5 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:36.891924065 +0000 UTC m=+37.566727308 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert") pod "ingress-canary-xhmzs" (UID: "52ec1912-fa03-4d61-8364-c8cc1159fcb5") : secret "canary-serving-cert" not found Apr 22 14:15:34.892257 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:34.891952 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls podName:46042ad7-7ef3-4e0b-88e4-d9d9077a34d0 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:36.89194587 +0000 UTC m=+37.566749113 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls") pod "dns-default-kxldt" (UID: "46042ad7-7ef3-4e0b-88e4-d9d9077a34d0") : secret "dns-default-metrics-tls" not found Apr 22 14:15:36.909236 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:36.909194 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs" Apr 22 14:15:36.909756 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:36.909337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt" Apr 22 14:15:36.909756 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:36.909365 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:36.909756 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:36.909446 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert podName:52ec1912-fa03-4d61-8364-c8cc1159fcb5 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:40.909425419 +0000 UTC m=+41.584228671 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert") pod "ingress-canary-xhmzs" (UID: "52ec1912-fa03-4d61-8364-c8cc1159fcb5") : secret "canary-serving-cert" not found Apr 22 14:15:36.909756 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:36.909453 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:36.909756 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:36.909509 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls podName:46042ad7-7ef3-4e0b-88e4-d9d9077a34d0 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:40.909492722 +0000 UTC m=+41.584295982 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls") pod "dns-default-kxldt" (UID: "46042ad7-7ef3-4e0b-88e4-d9d9077a34d0") : secret "dns-default-metrics-tls" not found Apr 22 14:15:37.187580 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:37.187546 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerStarted","Data":"4060f277746138ab9c8c007027458c66bb115c8a8ce1244b54e813d1b44b2f0b"} Apr 22 14:15:38.192428 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:38.192386 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e41ada6-0c6f-4380-8fc5-ac4005a2c30b" containerID="4060f277746138ab9c8c007027458c66bb115c8a8ce1244b54e813d1b44b2f0b" exitCode=0 Apr 22 14:15:38.192782 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:38.192448 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" 
event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerDied","Data":"4060f277746138ab9c8c007027458c66bb115c8a8ce1244b54e813d1b44b2f0b"} Apr 22 14:15:39.196824 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:39.196786 2578 generic.go:358] "Generic (PLEG): container finished" podID="0e41ada6-0c6f-4380-8fc5-ac4005a2c30b" containerID="29118330a04500545d66e3b1408d104465d0848366d8a9b17952f2694a6e570c" exitCode=0 Apr 22 14:15:39.197211 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:39.196853 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerDied","Data":"29118330a04500545d66e3b1408d104465d0848366d8a9b17952f2694a6e570c"} Apr 22 14:15:40.202263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:40.202219 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-z67br" event={"ID":"0e41ada6-0c6f-4380-8fc5-ac4005a2c30b","Type":"ContainerStarted","Data":"4d55550803649d35249fd8a48ae720274b74b18e78b66a137f60552e107ed1b3"} Apr 22 14:15:40.226412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:40.226356 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-z67br" podStartSLOduration=5.666207632 podStartE2EDuration="40.226339435s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:15:02.51764792 +0000 UTC m=+3.192451163" lastFinishedPulling="2026-04-22 14:15:37.07777972 +0000 UTC m=+37.752582966" observedRunningTime="2026-04-22 14:15:40.224671559 +0000 UTC m=+40.899474838" watchObservedRunningTime="2026-04-22 14:15:40.226339435 +0000 UTC m=+40.901142699" Apr 22 14:15:40.938980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:40.938935 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: 
\"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt" Apr 22 14:15:40.938980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:40.938984 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs" Apr 22 14:15:40.939188 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:40.939071 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:40.939188 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:40.939077 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:40.939188 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:40.939122 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert podName:52ec1912-fa03-4d61-8364-c8cc1159fcb5 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:48.939107689 +0000 UTC m=+49.613910937 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert") pod "ingress-canary-xhmzs" (UID: "52ec1912-fa03-4d61-8364-c8cc1159fcb5") : secret "canary-serving-cert" not found Apr 22 14:15:40.939188 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:40.939141 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls podName:46042ad7-7ef3-4e0b-88e4-d9d9077a34d0 nodeName:}" failed. No retries permitted until 2026-04-22 14:15:48.939127476 +0000 UTC m=+49.613930719 (durationBeforeRetry 8s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls") pod "dns-default-kxldt" (UID: "46042ad7-7ef3-4e0b-88e4-d9d9077a34d0") : secret "dns-default-metrics-tls" not found Apr 22 14:15:46.633794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.633757 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588"] Apr 22 14:15:46.637662 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.637644 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" Apr 22 14:15:46.641820 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.641785 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-dockercfg-mrqtw\"" Apr 22 14:15:46.641820 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.641805 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"openshift-service-ca.crt\"" Apr 22 14:15:46.642019 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.641839 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"open-cluster-management-agent-addon\"/\"kube-root-ca.crt\"" Apr 22 14:15:46.642019 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.641786 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"managed-serviceaccount-hub-kubeconfig\"" Apr 22 14:15:46.642019 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.641784 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"open-cluster-management-agent-addon\"/\"open-cluster-management-image-pull-credentials\"" Apr 22 14:15:46.646075 ip-10-0-132-130 kubenswrapper[2578]: 
I0422 14:15:46.646053 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588"] Apr 22 14:15:46.778894 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.778851 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e03df22-4c80-4e88-b0af-86bd967d1940-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-698579449c-lg588\" (UID: \"6e03df22-4c80-4e88-b0af-86bd967d1940\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" Apr 22 14:15:46.779087 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.778927 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qj9ll\" (UniqueName: \"kubernetes.io/projected/6e03df22-4c80-4e88-b0af-86bd967d1940-kube-api-access-qj9ll\") pod \"managed-serviceaccount-addon-agent-698579449c-lg588\" (UID: \"6e03df22-4c80-4e88-b0af-86bd967d1940\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" Apr 22 14:15:46.879649 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.879603 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qj9ll\" (UniqueName: \"kubernetes.io/projected/6e03df22-4c80-4e88-b0af-86bd967d1940-kube-api-access-qj9ll\") pod \"managed-serviceaccount-addon-agent-698579449c-lg588\" (UID: \"6e03df22-4c80-4e88-b0af-86bd967d1940\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" Apr 22 14:15:46.879918 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.879705 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e03df22-4c80-4e88-b0af-86bd967d1940-hub-kubeconfig\") pod 
\"managed-serviceaccount-addon-agent-698579449c-lg588\" (UID: \"6e03df22-4c80-4e88-b0af-86bd967d1940\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" Apr 22 14:15:46.882933 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.882906 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hub-kubeconfig\" (UniqueName: \"kubernetes.io/secret/6e03df22-4c80-4e88-b0af-86bd967d1940-hub-kubeconfig\") pod \"managed-serviceaccount-addon-agent-698579449c-lg588\" (UID: \"6e03df22-4c80-4e88-b0af-86bd967d1940\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" Apr 22 14:15:46.888657 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.888596 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qj9ll\" (UniqueName: \"kubernetes.io/projected/6e03df22-4c80-4e88-b0af-86bd967d1940-kube-api-access-qj9ll\") pod \"managed-serviceaccount-addon-agent-698579449c-lg588\" (UID: \"6e03df22-4c80-4e88-b0af-86bd967d1940\") " pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" Apr 22 14:15:46.959899 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:46.959862 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" Apr 22 14:15:47.086232 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:47.086200 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588"] Apr 22 14:15:47.089892 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:15:47.089867 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod6e03df22_4c80_4e88_b0af_86bd967d1940.slice/crio-f53125e1f210da571a071862609b08908061daec19912d5805c3de19a4cfc46e WatchSource:0}: Error finding container f53125e1f210da571a071862609b08908061daec19912d5805c3de19a4cfc46e: Status 404 returned error can't find the container with id f53125e1f210da571a071862609b08908061daec19912d5805c3de19a4cfc46e Apr 22 14:15:47.218596 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:47.218502 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" event={"ID":"6e03df22-4c80-4e88-b0af-86bd967d1940","Type":"ContainerStarted","Data":"f53125e1f210da571a071862609b08908061daec19912d5805c3de19a4cfc46e"} Apr 22 14:15:48.995772 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:48.995730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt" Apr 22 14:15:48.996241 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:48.995782 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: 
\"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs" Apr 22 14:15:48.996241 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:48.995904 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:15:48.996241 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:48.995943 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:15:48.996241 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:48.995997 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls podName:46042ad7-7ef3-4e0b-88e4-d9d9077a34d0 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:04.995974639 +0000 UTC m=+65.670777884 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls") pod "dns-default-kxldt" (UID: "46042ad7-7ef3-4e0b-88e4-d9d9077a34d0") : secret "dns-default-metrics-tls" not found Apr 22 14:15:48.996241 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:15:48.996017 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert podName:52ec1912-fa03-4d61-8364-c8cc1159fcb5 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:04.996007224 +0000 UTC m=+65.670810471 (durationBeforeRetry 16s). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert") pod "ingress-canary-xhmzs" (UID: "52ec1912-fa03-4d61-8364-c8cc1159fcb5") : secret "canary-serving-cert" not found Apr 22 14:15:52.228140 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:52.228098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" event={"ID":"6e03df22-4c80-4e88-b0af-86bd967d1940","Type":"ContainerStarted","Data":"3acfb65ed2aa152fd24f5ac745dcb91a29e1058c41af6ccb8c89c18b774a6c1b"} Apr 22 14:15:52.244106 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:52.244059 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" podStartSLOduration=2.13894931 podStartE2EDuration="6.244044238s" podCreationTimestamp="2026-04-22 14:15:46 +0000 UTC" firstStartedPulling="2026-04-22 14:15:47.091752879 +0000 UTC m=+47.766556122" lastFinishedPulling="2026-04-22 14:15:51.196847805 +0000 UTC m=+51.871651050" observedRunningTime="2026-04-22 14:15:52.243686541 +0000 UTC m=+52.918489806" watchObservedRunningTime="2026-04-22 14:15:52.244044238 +0000 UTC m=+52.918847503" Apr 22 14:15:59.183155 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:15:59.183124 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-2kpzl" Apr 22 14:16:05.012946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.012898 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs" Apr 22 14:16:05.013488 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.012991 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt" Apr 22 14:16:05.013488 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:05.013047 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:05.013488 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:05.013072 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:05.013488 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:05.013128 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert podName:52ec1912-fa03-4d61-8364-c8cc1159fcb5 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:37.013105376 +0000 UTC m=+97.687908625 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert") pod "ingress-canary-xhmzs" (UID: "52ec1912-fa03-4d61-8364-c8cc1159fcb5") : secret "canary-serving-cert" not found Apr 22 14:16:05.013488 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:05.013148 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls podName:46042ad7-7ef3-4e0b-88e4-d9d9077a34d0 nodeName:}" failed. No retries permitted until 2026-04-22 14:16:37.013139992 +0000 UTC m=+97.687943235 (durationBeforeRetry 32s). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls") pod "dns-default-kxldt" (UID: "46042ad7-7ef3-4e0b-88e4-d9d9077a34d0") : secret "dns-default-metrics-tls" not found Apr 22 14:16:05.616685 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.616642 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:16:05.619360 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.619341 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\"" Apr 22 14:16:05.627633 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:05.627609 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:16:05.627687 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:05.627672 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:17:09.627655999 +0000 UTC m=+130.302459243 (durationBeforeRetry 1m4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : secret "metrics-daemon-secret" not found Apr 22 14:16:05.717057 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.717022 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:16:05.719743 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.719726 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\"" Apr 22 14:16:05.730238 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.730209 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\"" Apr 22 14:16:05.740828 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.740794 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rqvjg\" (UniqueName: \"kubernetes.io/projected/177ed1a3-5a44-405b-a340-c5e0c5655232-kube-api-access-rqvjg\") pod \"network-check-target-9kmw7\" (UID: \"177ed1a3-5a44-405b-a340-c5e0c5655232\") " pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:16:05.769275 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.769244 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-95blp\"" Apr 22 14:16:05.776507 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.776481 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:16:05.888045 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:05.887980 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-9kmw7"] Apr 22 14:16:05.891652 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:16:05.891618 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod177ed1a3_5a44_405b_a340_c5e0c5655232.slice/crio-592f8f425996e1c0e597c5be4809dc9054367429a9def40df2cc5acab59cb32b WatchSource:0}: Error finding container 592f8f425996e1c0e597c5be4809dc9054367429a9def40df2cc5acab59cb32b: Status 404 returned error can't find the container with id 592f8f425996e1c0e597c5be4809dc9054367429a9def40df2cc5acab59cb32b Apr 22 14:16:06.254985 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:06.254898 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9kmw7" event={"ID":"177ed1a3-5a44-405b-a340-c5e0c5655232","Type":"ContainerStarted","Data":"592f8f425996e1c0e597c5be4809dc9054367429a9def40df2cc5acab59cb32b"} Apr 22 14:16:09.261590 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:09.261548 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-9kmw7" event={"ID":"177ed1a3-5a44-405b-a340-c5e0c5655232","Type":"ContainerStarted","Data":"22ac2247c2e6aa2c6a910e359fa7f18922d0cc8b1df07969b8cb05b13ee4be10"} Apr 22 14:16:09.261964 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:09.261717 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:16:09.277469 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:09.277418 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-9kmw7" 
podStartSLOduration=66.640699002 podStartE2EDuration="1m9.277402827s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:16:05.893471151 +0000 UTC m=+66.568274409" lastFinishedPulling="2026-04-22 14:16:08.530174987 +0000 UTC m=+69.204978234" observedRunningTime="2026-04-22 14:16:09.276608607 +0000 UTC m=+69.951411872" watchObservedRunningTime="2026-04-22 14:16:09.277402827 +0000 UTC m=+69.952206092" Apr 22 14:16:37.029203 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:37.029151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt" Apr 22 14:16:37.029203 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:37.029200 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs" Apr 22 14:16:37.029746 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:37.029314 2578 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 22 14:16:37.029746 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:37.029320 2578 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 22 14:16:37.029746 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:37.029376 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert podName:52ec1912-fa03-4d61-8364-c8cc1159fcb5 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:17:41.029360926 +0000 UTC m=+161.704164169 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert") pod "ingress-canary-xhmzs" (UID: "52ec1912-fa03-4d61-8364-c8cc1159fcb5") : secret "canary-serving-cert" not found Apr 22 14:16:37.029746 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:16:37.029389 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls podName:46042ad7-7ef3-4e0b-88e4-d9d9077a34d0 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:41.029383202 +0000 UTC m=+161.704186445 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls") pod "dns-default-kxldt" (UID: "46042ad7-7ef3-4e0b-88e4-d9d9077a34d0") : secret "dns-default-metrics-tls" not found Apr 22 14:16:40.266349 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:16:40.266313 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-9kmw7" Apr 22 14:17:09.657526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:09.657467 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:17:09.658017 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:09.657597 2578 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 22 14:17:09.658017 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:09.657660 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs podName:aee73a14-6669-4d65-8987-69628270ae6d nodeName:}" failed. No retries permitted until 2026-04-22 14:19:11.657645625 +0000 UTC m=+252.332448869 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs") pod "network-metrics-daemon-8q2mm" (UID: "aee73a14-6669-4d65-8987-69628270ae6d") : secret "metrics-daemon-secret" not found Apr 22 14:17:18.065250 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.065217 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-56d64c8c4-q4z5h"] Apr 22 14:17:18.068083 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.068063 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.070546 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.070523 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\"" Apr 22 14:17:18.070674 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.070525 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\"" Apr 22 14:17:18.070805 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.070791 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\"" Apr 22 14:17:18.070860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.070818 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9xqrs\"" Apr 22 14:17:18.070860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.070818 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\"" Apr 22 14:17:18.070956 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.070876 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\"" Apr 22 14:17:18.070956 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.070923 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\"" Apr 22 14:17:18.082780 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.082758 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-56d64c8c4-q4z5h"] Apr 22 14:17:18.170659 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.170626 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj"] Apr 22 14:17:18.173214 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.173192 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-59b68764cb-dh9r7"] Apr 22 14:17:18.173373 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.173352 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj" Apr 22 14:17:18.175990 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.175968 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.182486 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.182466 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:18.182870 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.182852 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-pr4jm\"" Apr 22 14:17:18.184110 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.184093 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 22 14:17:18.184205 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.184121 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:18.184696 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.184680 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 22 14:17:18.185214 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.185195 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 22 14:17:18.185549 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.185475 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-bgh6b\"" Apr 22 14:17:18.195168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.195145 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 22 14:17:18.206516 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.206494 2578 kubelet.go:2544] "SyncLoop 
UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj"] Apr 22 14:17:18.217727 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.217702 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-stats-auth\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.217871 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.217732 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qql8m\" (UniqueName: \"kubernetes.io/projected/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-kube-api-access-qql8m\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.217871 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.217818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.217871 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.217857 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-default-certificate\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.218020 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.217918 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.242862 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.242823 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59b68764cb-dh9r7"] Apr 22 14:17:18.283616 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.283583 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527"] Apr 22 14:17:18.286432 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.286401 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q"] Apr 22 14:17:18.286559 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.286543 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.288816 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.288796 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.292496 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.292464 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\"" Apr 22 14:17:18.292727 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.292705 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"serving-cert\"" Apr 22 14:17:18.292727 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.292724 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-dockercfg-pk44d\"" Apr 22 14:17:18.292936 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.292710 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-hpbw6\"" Apr 22 14:17:18.292936 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.292726 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\"" Apr 22 14:17:18.292936 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.292761 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"service-ca-operator-config\"" Apr 22 14:17:18.293262 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.292975 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\"" Apr 22 14:17:18.297497 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.297472 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"openshift-service-ca.crt\"" Apr 22 14:17:18.298330 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.297772 2578 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca-operator\"/\"kube-root-ca.crt\"" Apr 22 14:17:18.300721 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.300151 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\"" Apr 22 14:17:18.303075 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.303053 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q"] Apr 22 14:17:18.304444 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.304424 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527"] Apr 22 14:17:18.318648 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-installation-pull-secrets\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.318648 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-bound-sa-token\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.318648 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-stats-auth\") pod \"router-default-56d64c8c4-q4z5h\" (UID: 
\"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.318828 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318667 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qql8m\" (UniqueName: \"kubernetes.io/projected/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-kube-api-access-qql8m\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.318828 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318689 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jz428\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-kube-api-access-jz428\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.318828 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318709 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-image-registry-private-configuration\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.318828 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318810 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.318945 ip-10-0-132-130 kubenswrapper[2578]: I0422 
14:17:18.318834 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-x2n8p\" (UniqueName: \"kubernetes.io/projected/f204bbda-e891-46ec-b7ae-0fa017516505-kube-api-access-x2n8p\") pod \"volume-data-source-validator-7c6cbb6c87-bm8bj\" (UID: \"f204bbda-e891-46ec-b7ae-0fa017516505\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj" Apr 22 14:17:18.318945 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318862 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.318945 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318902 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-trusted-ca\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.318945 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.318928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-default-certificate\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.319124 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.318958 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:18.319124 ip-10-0-132-130 kubenswrapper[2578]: I0422 
14:17:18.318965 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.319124 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.319018 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:18.81899801 +0000 UTC m=+139.493801258 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : secret "router-metrics-certs-default" not found Apr 22 14:17:18.319124 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.319048 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:18.819032851 +0000 UTC m=+139.493836105 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:18.319124 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.319092 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/774fe819-0129-443d-bc87-ddbe8c62267a-ca-trust-extracted\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.319124 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.319119 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-registry-certificates\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.321219 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.321189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-stats-auth\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.321353 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.321332 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-default-certificate\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " 
pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.335267 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.335240 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qql8m\" (UniqueName: \"kubernetes.io/projected/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-kube-api-access-qql8m\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.420522 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420488 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/774fe819-0129-443d-bc87-ddbe8c62267a-ca-trust-extracted\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.420522 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420523 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-registry-certificates\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.420773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-installation-pull-secrets\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.420773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420571 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" 
(UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-bound-sa-token\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.420773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420606 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/78b16110-0b35-41a7-b840-f66b6fb4ac09-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.420773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420635 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jz428\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-kube-api-access-jz428\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.420773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.420773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d665af-9e8f-41f5-bc80-5b21a812d08d-config\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: 
\"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.420773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420754 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-image-registry-private-configuration\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.421136 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420798 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.421136 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420837 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-x2n8p\" (UniqueName: \"kubernetes.io/projected/f204bbda-e891-46ec-b7ae-0fa017516505-kube-api-access-x2n8p\") pod \"volume-data-source-validator-7c6cbb6c87-bm8bj\" (UID: \"f204bbda-e891-46ec-b7ae-0fa017516505\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj" Apr 22 14:17:18.421136 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-bfzlc\" (UniqueName: \"kubernetes.io/projected/e0d665af-9e8f-41f5-bc80-5b21a812d08d-kube-api-access-bfzlc\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: \"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.421136 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420930 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-trusted-ca\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.421136 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.420946 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:18.421136 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xs8kn\" (UniqueName: \"kubernetes.io/projected/78b16110-0b35-41a7-b840-f66b6fb4ac09-kube-api-access-xs8kn\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.421136 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.420971 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-59b68764cb-dh9r7: secret "image-registry-tls" not found Apr 22 14:17:18.421136 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.420997 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d665af-9e8f-41f5-bc80-5b21a812d08d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: \"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.421136 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.421045 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls podName:774fe819-0129-443d-bc87-ddbe8c62267a nodeName:}" failed. No retries permitted until 2026-04-22 14:17:18.921024537 +0000 UTC m=+139.595827782 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls") pod "image-registry-59b68764cb-dh9r7" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a") : secret "image-registry-tls" not found Apr 22 14:17:18.421520 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.421238 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-registry-certificates\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.421520 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.421324 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/774fe819-0129-443d-bc87-ddbe8c62267a-ca-trust-extracted\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.421924 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.421902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-trusted-ca\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.423365 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.423337 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" 
(UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-installation-pull-secrets\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.423477 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.423454 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-image-registry-private-configuration\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.432407 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.432379 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-bound-sa-token\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.432537 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.432518 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-x2n8p\" (UniqueName: \"kubernetes.io/projected/f204bbda-e891-46ec-b7ae-0fa017516505-kube-api-access-x2n8p\") pod \"volume-data-source-validator-7c6cbb6c87-bm8bj\" (UID: \"f204bbda-e891-46ec-b7ae-0fa017516505\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj" Apr 22 14:17:18.433073 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.433048 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jz428\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-kube-api-access-jz428\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " 
pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.482386 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.482353 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj" Apr 22 14:17:18.521765 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.521726 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/78b16110-0b35-41a7-b840-f66b6fb4ac09-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.521765 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.521772 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.522015 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.521842 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d665af-9e8f-41f5-bc80-5b21a812d08d-config\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: \"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.522015 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.521867 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:18.522015 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.521894 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-bfzlc\" (UniqueName: \"kubernetes.io/projected/e0d665af-9e8f-41f5-bc80-5b21a812d08d-kube-api-access-bfzlc\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: \"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.522015 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.521933 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls podName:78b16110-0b35-41a7-b840-f66b6fb4ac09 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:19.021913597 +0000 UTC m=+139.696716840 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4x527" (UID: "78b16110-0b35-41a7-b840-f66b6fb4ac09") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:18.522015 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.521967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xs8kn\" (UniqueName: \"kubernetes.io/projected/78b16110-0b35-41a7-b840-f66b6fb4ac09-kube-api-access-xs8kn\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.522015 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.521989 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d665af-9e8f-41f5-bc80-5b21a812d08d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: \"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " 
pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.522525 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.522502 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/e0d665af-9e8f-41f5-bc80-5b21a812d08d-config\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: \"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.522700 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.522681 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/78b16110-0b35-41a7-b840-f66b6fb4ac09-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.524317 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.524274 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/e0d665af-9e8f-41f5-bc80-5b21a812d08d-serving-cert\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: \"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.531407 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.531374 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xs8kn\" (UniqueName: \"kubernetes.io/projected/78b16110-0b35-41a7-b840-f66b6fb4ac09-kube-api-access-xs8kn\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:18.531875 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.531855 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kube-api-access-bfzlc\" (UniqueName: \"kubernetes.io/projected/e0d665af-9e8f-41f5-bc80-5b21a812d08d-kube-api-access-bfzlc\") pod \"service-ca-operator-d6fc45fc5-88d4q\" (UID: \"e0d665af-9e8f-41f5-bc80-5b21a812d08d\") " pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.594465 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.594385 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj"] Apr 22 14:17:18.597142 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:18.597104 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf204bbda_e891_46ec_b7ae_0fa017516505.slice/crio-b51dd176a2208cc19097c71bbeb2079fefe5c19a443ea2ef26db6c6f5c4ced1e WatchSource:0}: Error finding container b51dd176a2208cc19097c71bbeb2079fefe5c19a443ea2ef26db6c6f5c4ced1e: Status 404 returned error can't find the container with id b51dd176a2208cc19097c71bbeb2079fefe5c19a443ea2ef26db6c6f5c4ced1e Apr 22 14:17:18.604186 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.604160 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" Apr 22 14:17:18.715908 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.715877 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q"] Apr 22 14:17:18.718635 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:18.718610 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode0d665af_9e8f_41f5_bc80_5b21a812d08d.slice/crio-3369a73f1d7c467a4eba4fe68e2fc228f8dc538af08e97004ad4361d2c68df53 WatchSource:0}: Error finding container 3369a73f1d7c467a4eba4fe68e2fc228f8dc538af08e97004ad4361d2c68df53: Status 404 returned error can't find the container with id 3369a73f1d7c467a4eba4fe68e2fc228f8dc538af08e97004ad4361d2c68df53 Apr 22 14:17:18.825257 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.825218 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.825442 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.825343 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:18.825442 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.825399 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. 
No retries permitted until 2026-04-22 14:17:19.825381993 +0000 UTC m=+140.500185240 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:18.825524 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.825450 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:18.825524 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.825505 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:19.825490175 +0000 UTC m=+140.500293418 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : secret "router-metrics-certs-default" not found Apr 22 14:17:18.925770 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:18.925731 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:18.925937 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.925843 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:18.925937 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.925854 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-59b68764cb-dh9r7: secret "image-registry-tls" not found Apr 22 14:17:18.925937 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:18.925902 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls podName:774fe819-0129-443d-bc87-ddbe8c62267a nodeName:}" failed. No retries permitted until 2026-04-22 14:17:19.925888319 +0000 UTC m=+140.600691562 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls") pod "image-registry-59b68764cb-dh9r7" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a") : secret "image-registry-tls" not found Apr 22 14:17:19.026565 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:19.026513 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:19.026745 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:19.026685 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:19.026794 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:19.026760 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls podName:78b16110-0b35-41a7-b840-f66b6fb4ac09 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:20.026742784 +0000 UTC m=+140.701546032 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4x527" (UID: "78b16110-0b35-41a7-b840-f66b6fb4ac09") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:19.399107 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:19.399065 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" event={"ID":"e0d665af-9e8f-41f5-bc80-5b21a812d08d","Type":"ContainerStarted","Data":"3369a73f1d7c467a4eba4fe68e2fc228f8dc538af08e97004ad4361d2c68df53"} Apr 22 14:17:19.400205 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:19.400143 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj" event={"ID":"f204bbda-e891-46ec-b7ae-0fa017516505","Type":"ContainerStarted","Data":"b51dd176a2208cc19097c71bbeb2079fefe5c19a443ea2ef26db6c6f5c4ced1e"} Apr 22 14:17:19.833647 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:19.833574 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:19.833647 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:19.833636 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:19.833835 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:19.833730 2578 
secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:19.833835 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:19.833788 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:21.833773825 +0000 UTC m=+142.508577071 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : secret "router-metrics-certs-default" not found Apr 22 14:17:19.833835 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:19.833820 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:21.83380371 +0000 UTC m=+142.508606974 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:19.934781 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:19.934749 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:19.934937 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:19.934897 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:19.934937 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:19.934917 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-59b68764cb-dh9r7: secret "image-registry-tls" not found Apr 22 14:17:19.935019 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:19.934970 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls podName:774fe819-0129-443d-bc87-ddbe8c62267a nodeName:}" failed. No retries permitted until 2026-04-22 14:17:21.934954432 +0000 UTC m=+142.609757680 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls") pod "image-registry-59b68764cb-dh9r7" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a") : secret "image-registry-tls" not found Apr 22 14:17:20.036116 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:20.036073 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:20.036327 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:20.036219 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:20.036327 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:20.036309 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls podName:78b16110-0b35-41a7-b840-f66b6fb4ac09 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:22.036276371 +0000 UTC m=+142.711079627 (durationBeforeRetry 2s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4x527" (UID: "78b16110-0b35-41a7-b840-f66b6fb4ac09") : secret "cluster-monitoring-operator-tls" not found Apr 22 14:17:20.403804 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:20.403753 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj" event={"ID":"f204bbda-e891-46ec-b7ae-0fa017516505","Type":"ContainerStarted","Data":"dd6ee11885ba5368d94361ece070235f8b84a91b41e63336b6d9b322556ddec3"} Apr 22 14:17:20.421495 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:20.421418 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-bm8bj" podStartSLOduration=1.174739856 podStartE2EDuration="2.421401352s" podCreationTimestamp="2026-04-22 14:17:18 +0000 UTC" firstStartedPulling="2026-04-22 14:17:18.599757219 +0000 UTC m=+139.274560462" lastFinishedPulling="2026-04-22 14:17:19.846418709 +0000 UTC m=+140.521221958" observedRunningTime="2026-04-22 14:17:20.41974894 +0000 UTC m=+141.094552206" watchObservedRunningTime="2026-04-22 14:17:20.421401352 +0000 UTC m=+141.096204618" Apr 22 14:17:21.406805 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:21.406766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" event={"ID":"e0d665af-9e8f-41f5-bc80-5b21a812d08d","Type":"ContainerStarted","Data":"4009fc62dbadd4119ff873a7bdc6a29b1ad6a3091eeccb110cbd3121bcc17c10"} Apr 22 14:17:21.425038 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:21.424984 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" 
podStartSLOduration=1.571659691 podStartE2EDuration="3.424970271s" podCreationTimestamp="2026-04-22 14:17:18 +0000 UTC" firstStartedPulling="2026-04-22 14:17:18.720397896 +0000 UTC m=+139.395201139" lastFinishedPulling="2026-04-22 14:17:20.573708473 +0000 UTC m=+141.248511719" observedRunningTime="2026-04-22 14:17:21.424498022 +0000 UTC m=+142.099301286" watchObservedRunningTime="2026-04-22 14:17:21.424970271 +0000 UTC m=+142.099773535" Apr 22 14:17:21.852050 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:21.851959 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:21.852209 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:21.852062 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h" Apr 22 14:17:21.852209 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:21.852147 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 22 14:17:21.852209 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:21.852150 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:25.852128827 +0000 UTC m=+146.526932071 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : configmap references non-existent config key: service-ca.crt Apr 22 14:17:21.852209 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:21.852194 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:25.852182562 +0000 UTC m=+146.526985805 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : secret "router-metrics-certs-default" not found Apr 22 14:17:21.952486 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:21.952452 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:17:21.952633 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:21.952583 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 22 14:17:21.952633 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:21.952600 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-59b68764cb-dh9r7: secret "image-registry-tls" not found Apr 22 14:17:21.952729 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:21.952650 2578 nestedpendingoperations.go:348] 
Operation for "{volumeName:kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls podName:774fe819-0129-443d-bc87-ddbe8c62267a nodeName:}" failed. No retries permitted until 2026-04-22 14:17:25.952635395 +0000 UTC m=+146.627438638 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls") pod "image-registry-59b68764cb-dh9r7" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a") : secret "image-registry-tls" not found
Apr 22 14:17:22.052929 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:22.052889 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527"
Apr 22 14:17:22.053101 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:22.053030 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:22.053101 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:22.053100 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls podName:78b16110-0b35-41a7-b840-f66b6fb4ac09 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:26.05308308 +0000 UTC m=+146.727886328 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4x527" (UID: "78b16110-0b35-41a7-b840-f66b6fb4ac09") : secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:25.882952 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:25.882918 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:25.883371 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:25.882978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:25.883371 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:25.883069 2578 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 22 14:17:25.883371 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:25.883137 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:33.883121704 +0000 UTC m=+154.557924947 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : secret "router-metrics-certs-default" not found
Apr 22 14:17:25.883371 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:25.883151 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle podName:817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:33.883144343 +0000 UTC m=+154.557947586 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle") pod "router-default-56d64c8c4-q4z5h" (UID: "817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6") : configmap references non-existent config key: service-ca.crt
Apr 22 14:17:25.984041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:25.984006 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7"
Apr 22 14:17:25.984206 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:25.984148 2578 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 22 14:17:25.984206 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:25.984166 2578 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-59b68764cb-dh9r7: secret "image-registry-tls" not found
Apr 22 14:17:25.984434 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:25.984218 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls podName:774fe819-0129-443d-bc87-ddbe8c62267a nodeName:}" failed. No retries permitted until 2026-04-22 14:17:33.984203211 +0000 UTC m=+154.659006454 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls") pod "image-registry-59b68764cb-dh9r7" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a") : secret "image-registry-tls" not found
Apr 22 14:17:26.085406 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:26.085371 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527"
Apr 22 14:17:26.085528 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:26.085485 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:26.085565 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:26.085543 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls podName:78b16110-0b35-41a7-b840-f66b6fb4ac09 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:34.085526672 +0000 UTC m=+154.760329932 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4x527" (UID: "78b16110-0b35-41a7-b840-f66b6fb4ac09") : secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:26.412623 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:26.412588 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wbp94_0a26fb71-5407-413c-a14c-18f3085f4abf/dns-node-resolver/0.log"
Apr 22 14:17:27.410065 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:27.410030 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mnpw8_de390571-ad59-463e-84ce-017e582c71b4/node-ca/0.log"
Apr 22 14:17:33.946182 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:33.946127 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:33.946182 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:33.946195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:33.946814 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:33.946792 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-service-ca-bundle\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:33.948574 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:33.948538 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6-metrics-certs\") pod \"router-default-56d64c8c4-q4z5h\" (UID: \"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6\") " pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:33.977246 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:33.977213 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:34.047732 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.047443 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7"
Apr 22 14:17:34.050254 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.050227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"image-registry-59b68764cb-dh9r7\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") " pod="openshift-image-registry/image-registry-59b68764cb-dh9r7"
Apr 22 14:17:34.088311 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.088257 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7"
Apr 22 14:17:34.096409 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.096379 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-56d64c8c4-q4z5h"]
Apr 22 14:17:34.099148 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:34.099110 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod817ce9fb_bd94_4e0f_bbbc_cc50e1ef3bf6.slice/crio-9ae28dbc4cc2f8c23a5e841c34c71961caffc2385e9c5234dbf0f396e33723c9 WatchSource:0}: Error finding container 9ae28dbc4cc2f8c23a5e841c34c71961caffc2385e9c5234dbf0f396e33723c9: Status 404 returned error can't find the container with id 9ae28dbc4cc2f8c23a5e841c34c71961caffc2385e9c5234dbf0f396e33723c9
Apr 22 14:17:34.148529 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.148492 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527"
Apr 22 14:17:34.148710 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:34.148667 2578 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:34.148827 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:34.148730 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls podName:78b16110-0b35-41a7-b840-f66b6fb4ac09 nodeName:}" failed. No retries permitted until 2026-04-22 14:17:50.14871463 +0000 UTC m=+170.823517876 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-4x527" (UID: "78b16110-0b35-41a7-b840-f66b6fb4ac09") : secret "cluster-monitoring-operator-tls" not found
Apr 22 14:17:34.215168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.215095 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-59b68764cb-dh9r7"]
Apr 22 14:17:34.217965 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:34.217936 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod774fe819_0129_443d_bc87_ddbe8c62267a.slice/crio-6e48419fe1a95c006434857505a589c599d25098f8c5896e863b2e15a2b533bd WatchSource:0}: Error finding container 6e48419fe1a95c006434857505a589c599d25098f8c5896e863b2e15a2b533bd: Status 404 returned error can't find the container with id 6e48419fe1a95c006434857505a589c599d25098f8c5896e863b2e15a2b533bd
Apr 22 14:17:34.434470 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.434424 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" event={"ID":"774fe819-0129-443d-bc87-ddbe8c62267a","Type":"ContainerStarted","Data":"cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e"}
Apr 22 14:17:34.434470 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.434472 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" event={"ID":"774fe819-0129-443d-bc87-ddbe8c62267a","Type":"ContainerStarted","Data":"6e48419fe1a95c006434857505a589c599d25098f8c5896e863b2e15a2b533bd"}
Apr 22 14:17:34.434699 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.434523 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7"
Apr 22 14:17:34.435780 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.435757 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-56d64c8c4-q4z5h" event={"ID":"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6","Type":"ContainerStarted","Data":"0d0616633af0f77ad89743a8467c7888d9d6ad85d7a9263d5401f0c7a1a8d81a"}
Apr 22 14:17:34.435780 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.435781 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-56d64c8c4-q4z5h" event={"ID":"817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6","Type":"ContainerStarted","Data":"9ae28dbc4cc2f8c23a5e841c34c71961caffc2385e9c5234dbf0f396e33723c9"}
Apr 22 14:17:34.459042 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.458982 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" podStartSLOduration=16.458965743 podStartE2EDuration="16.458965743s" podCreationTimestamp="2026-04-22 14:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:34.457799965 +0000 UTC m=+155.132603231" watchObservedRunningTime="2026-04-22 14:17:34.458965743 +0000 UTC m=+155.133769002"
Apr 22 14:17:34.480184 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.480065 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-56d64c8c4-q4z5h" podStartSLOduration=16.480044979 podStartE2EDuration="16.480044979s" podCreationTimestamp="2026-04-22 14:17:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:17:34.47957262 +0000 UTC m=+155.154375886" watchObservedRunningTime="2026-04-22 14:17:34.480044979 +0000 UTC m=+155.154848245"
Apr 22 14:17:34.978260 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.978223 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:34.980759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:34.980735 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:35.438940 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:35.438906 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:35.440086 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:35.440064 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-56d64c8c4-q4z5h"
Apr 22 14:17:36.209864 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:36.209818 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-kxldt" podUID="46042ad7-7ef3-4e0b-88e4-d9d9077a34d0"
Apr 22 14:17:36.215967 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:36.215931 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-xhmzs" podUID="52ec1912-fa03-4d61-8364-c8cc1159fcb5"
Apr 22 14:17:36.441570 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:36.441539 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:17:36.441570 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:36.441558 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kxldt"
Apr 22 14:17:36.960415 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:17:36.960369 2578 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-8q2mm" podUID="aee73a14-6669-4d65-8987-69628270ae6d"
Apr 22 14:17:41.099191 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.099082 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:17:41.099191 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.099142 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:17:41.101587 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.101556 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/46042ad7-7ef3-4e0b-88e4-d9d9077a34d0-metrics-tls\") pod \"dns-default-kxldt\" (UID: \"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0\") " pod="openshift-dns/dns-default-kxldt"
Apr 22 14:17:41.101697 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.101588 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/52ec1912-fa03-4d61-8364-c8cc1159fcb5-cert\") pod \"ingress-canary-xhmzs\" (UID: \"52ec1912-fa03-4d61-8364-c8cc1159fcb5\") " pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:17:41.246654 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.246611 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-hhqr7\""
Apr 22 14:17:41.246654 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.246618 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-qh95w\""
Apr 22 14:17:41.253612 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.253590 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-kxldt"
Apr 22 14:17:41.253678 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.253664 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-xhmzs"
Apr 22 14:17:41.385280 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.385250 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-kxldt"]
Apr 22 14:17:41.388199 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:41.388166 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod46042ad7_7ef3_4e0b_88e4_d9d9077a34d0.slice/crio-a3d1e5275965bdd625c23f516df638979d3648528db1b1cc74ebb422d0b740e7 WatchSource:0}: Error finding container a3d1e5275965bdd625c23f516df638979d3648528db1b1cc74ebb422d0b740e7: Status 404 returned error can't find the container with id a3d1e5275965bdd625c23f516df638979d3648528db1b1cc74ebb422d0b740e7
Apr 22 14:17:41.404270 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.404161 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-xhmzs"]
Apr 22 14:17:41.406514 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:41.406489 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod52ec1912_fa03_4d61_8364_c8cc1159fcb5.slice/crio-7a75a07aca00a88e85a88b5da4b7bcd827114ba02755c1e4db46ddcb0f8a6463 WatchSource:0}: Error finding container 7a75a07aca00a88e85a88b5da4b7bcd827114ba02755c1e4db46ddcb0f8a6463: Status 404 returned error can't find the container with id 7a75a07aca00a88e85a88b5da4b7bcd827114ba02755c1e4db46ddcb0f8a6463
Apr 22 14:17:41.454002 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.453953 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kxldt" event={"ID":"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0","Type":"ContainerStarted","Data":"a3d1e5275965bdd625c23f516df638979d3648528db1b1cc74ebb422d0b740e7"}
Apr 22 14:17:41.454887 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:41.454859 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xhmzs" event={"ID":"52ec1912-fa03-4d61-8364-c8cc1159fcb5","Type":"ContainerStarted","Data":"7a75a07aca00a88e85a88b5da4b7bcd827114ba02755c1e4db46ddcb0f8a6463"}
Apr 22 14:17:43.462964 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:43.462926 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kxldt" event={"ID":"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0","Type":"ContainerStarted","Data":"e53e06133b82bf0ad1f44ecf4e8b0584510ad3cd760e8bbf8114aa32514d0c91"}
Apr 22 14:17:43.464436 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:43.464412 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-xhmzs" event={"ID":"52ec1912-fa03-4d61-8364-c8cc1159fcb5","Type":"ContainerStarted","Data":"66ea2d57ee5f03bbb01e0e370986a12534a1686cf2a34989aa4bccd4f7be2833"}
Apr 22 14:17:43.481591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:43.481535 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress-canary/ingress-canary-xhmzs" podStartSLOduration=128.635548749 podStartE2EDuration="2m10.481514527s" podCreationTimestamp="2026-04-22 14:15:33 +0000 UTC" firstStartedPulling="2026-04-22 14:17:41.408179414 +0000 UTC m=+162.082982657" lastFinishedPulling="2026-04-22 14:17:43.254145188 +0000 UTC m=+163.928948435" observedRunningTime="2026-04-22 14:17:43.481347157 +0000 UTC m=+164.156150425" watchObservedRunningTime="2026-04-22 14:17:43.481514527 +0000 UTC m=+164.156317792"
Apr 22 14:17:44.472476 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.472437 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-kxldt" event={"ID":"46042ad7-7ef3-4e0b-88e4-d9d9077a34d0","Type":"ContainerStarted","Data":"43e64a0bfb177cc0d256bc4f6b3987bb0779f98cd61137752d294a61585a4d5a"}
Apr 22 14:17:44.534002 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.533947 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-kxldt" podStartSLOduration=129.672486526 podStartE2EDuration="2m11.533931789s" podCreationTimestamp="2026-04-22 14:15:33 +0000 UTC" firstStartedPulling="2026-04-22 14:17:41.389880341 +0000 UTC m=+162.064683584" lastFinishedPulling="2026-04-22 14:17:43.251325603 +0000 UTC m=+163.926128847" observedRunningTime="2026-04-22 14:17:44.533135454 +0000 UTC m=+165.207938716" watchObservedRunningTime="2026-04-22 14:17:44.533931789 +0000 UTC m=+165.208735053"
Apr 22 14:17:44.692601 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.692569 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-59b68764cb-dh9r7"]
Apr 22 14:17:44.740744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.740712 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-kfnwb"]
Apr 22 14:17:44.743803 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.743783 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-dtswc"]
Apr 22 14:17:44.743969 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.743948 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kfnwb"
Apr 22 14:17:44.746775 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.746756 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.748485 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.748465 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\""
Apr 22 14:17:44.748983 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.748963 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-rjzvv\""
Apr 22 14:17:44.751274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.751256 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\""
Apr 22 14:17:44.751575 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.751557 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\""
Apr 22 14:17:44.765311 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.765276 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\""
Apr 22 14:17:44.765658 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.765644 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-4zp5l\""
Apr 22 14:17:44.766521 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.766490 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\""
Apr 22 14:17:44.766651 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.766634 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\""
Apr 22 14:17:44.791074 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.791045 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kfnwb"]
Apr 22 14:17:44.791861 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.791838 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dtswc"]
Apr 22 14:17:44.829521 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.829489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-crio-socket\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.829521 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.829518 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-q46xq\" (UniqueName: \"kubernetes.io/projected/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-kube-api-access-q46xq\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.829717 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.829539 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-data-volume\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.829717 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.829554 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.829717 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.829608 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-qzq9v\" (UniqueName: \"kubernetes.io/projected/8fdb260a-3bb8-4141-8359-e18230a3d1ee-kube-api-access-qzq9v\") pod \"downloads-6bcc868b7-kfnwb\" (UID: \"8fdb260a-3bb8-4141-8359-e18230a3d1ee\") " pod="openshift-console/downloads-6bcc868b7-kfnwb"
Apr 22 14:17:44.829717 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.829657 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.930224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930190 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-crio-socket\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.930224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-q46xq\" (UniqueName: \"kubernetes.io/projected/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-kube-api-access-q46xq\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.930498 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930243 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-data-volume\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.930498 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930260 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.930498 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930280 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-qzq9v\" (UniqueName: \"kubernetes.io/projected/8fdb260a-3bb8-4141-8359-e18230a3d1ee-kube-api-access-qzq9v\") pod \"downloads-6bcc868b7-kfnwb\" (UID: \"8fdb260a-3bb8-4141-8359-e18230a3d1ee\") " pod="openshift-console/downloads-6bcc868b7-kfnwb"
Apr 22 14:17:44.930498 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930339 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-crio-socket\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.930498 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930392 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.930717 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-data-volume\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.930942 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.930893 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.932609 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.932585 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.947352 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.947328 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-q46xq\" (UniqueName: \"kubernetes.io/projected/f5aeece0-3eea-4cf5-8d59-cf4520bff33c-kube-api-access-q46xq\") pod \"insights-runtime-extractor-dtswc\" (UID: \"f5aeece0-3eea-4cf5-8d59-cf4520bff33c\") " pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:44.948137 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:44.948120 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-qzq9v\" (UniqueName: \"kubernetes.io/projected/8fdb260a-3bb8-4141-8359-e18230a3d1ee-kube-api-access-qzq9v\") pod \"downloads-6bcc868b7-kfnwb\" (UID: \"8fdb260a-3bb8-4141-8359-e18230a3d1ee\") " pod="openshift-console/downloads-6bcc868b7-kfnwb"
Apr 22 14:17:45.054968 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:45.054882 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-kfnwb"
Apr 22 14:17:45.059692 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:45.059657 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-dtswc"
Apr 22 14:17:45.227171 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:45.227129 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8fdb260a_3bb8_4141_8359_e18230a3d1ee.slice/crio-432bf1051ba3dac5c8aeb2c81a3a21f13b69e6df9ed90200a9bc9df7fa0b2f45 WatchSource:0}: Error finding container 432bf1051ba3dac5c8aeb2c81a3a21f13b69e6df9ed90200a9bc9df7fa0b2f45: Status 404 returned error can't find the container with id 432bf1051ba3dac5c8aeb2c81a3a21f13b69e6df9ed90200a9bc9df7fa0b2f45
Apr 22 14:17:45.232955 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:45.232931 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-kfnwb"]
Apr 22 14:17:45.237925 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:45.237903 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-dtswc"]
Apr 22 14:17:45.475725 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:45.475685 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dtswc" event={"ID":"f5aeece0-3eea-4cf5-8d59-cf4520bff33c","Type":"ContainerStarted","Data":"39392129f5fe34b5f7d0ced6c4d3321cfe1e8a06b8604a88ad68dbd401acf411"}
Apr 22 14:17:45.475725 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:45.475727 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dtswc" event={"ID":"f5aeece0-3eea-4cf5-8d59-cf4520bff33c","Type":"ContainerStarted","Data":"b59196f7aefba3b146b94ed0f1cd7ce714d9c3ce6493b425d3eec3de7c54606c"}
Apr 22 14:17:45.476883 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:45.476858 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kfnwb" event={"ID":"8fdb260a-3bb8-4141-8359-e18230a3d1ee","Type":"ContainerStarted","Data":"432bf1051ba3dac5c8aeb2c81a3a21f13b69e6df9ed90200a9bc9df7fa0b2f45"}
Apr 22 14:17:45.477072 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:45.477056 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-kxldt"
Apr 22 14:17:46.482371 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:46.482328 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dtswc" event={"ID":"f5aeece0-3eea-4cf5-8d59-cf4520bff33c","Type":"ContainerStarted","Data":"14392c1cc4f0cc8a6cd2fb0dcdb996467397dcd4f728e53058fb1ff577173000"}
Apr 22 14:17:48.492998 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:48.492962 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-dtswc" event={"ID":"f5aeece0-3eea-4cf5-8d59-cf4520bff33c","Type":"ContainerStarted","Data":"a42de8c414491b4e5bf80cc3388150c8d828bcf7ca6f2d1ce1c095d97b7a9020"}
Apr 22 14:17:48.519968 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:48.519914 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-dtswc" podStartSLOduration=2.354018604
podStartE2EDuration="4.519900263s" podCreationTimestamp="2026-04-22 14:17:44 +0000 UTC" firstStartedPulling="2026-04-22 14:17:45.285574981 +0000 UTC m=+165.960378224" lastFinishedPulling="2026-04-22 14:17:47.451456625 +0000 UTC m=+168.126259883" observedRunningTime="2026-04-22 14:17:48.518408739 +0000 UTC m=+169.193212004" watchObservedRunningTime="2026-04-22 14:17:48.519900263 +0000 UTC m=+169.194703527" Apr 22 14:17:48.947003 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:48.946967 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm" Apr 22 14:17:50.173405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:50.173362 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:50.176070 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:50.176038 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/78b16110-0b35-41a7-b840-f66b6fb4ac09-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-4x527\" (UID: \"78b16110-0b35-41a7-b840-f66b6fb4ac09\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:50.396118 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:50.396054 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" Apr 22 14:17:50.534336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:50.534141 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527"] Apr 22 14:17:50.537653 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:50.537604 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod78b16110_0b35_41a7_b840_f66b6fb4ac09.slice/crio-49f5f5fd00feff5a546a9de445e5f213380ff4a94b71693b7491db70e6a56b29 WatchSource:0}: Error finding container 49f5f5fd00feff5a546a9de445e5f213380ff4a94b71693b7491db70e6a56b29: Status 404 returned error can't find the container with id 49f5f5fd00feff5a546a9de445e5f213380ff4a94b71693b7491db70e6a56b29 Apr 22 14:17:51.503559 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:51.503496 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" event={"ID":"78b16110-0b35-41a7-b840-f66b6fb4ac09","Type":"ContainerStarted","Data":"49f5f5fd00feff5a546a9de445e5f213380ff4a94b71693b7491db70e6a56b29"} Apr 22 14:17:51.505575 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:51.505542 2578 generic.go:358] "Generic (PLEG): container finished" podID="6e03df22-4c80-4e88-b0af-86bd967d1940" containerID="3acfb65ed2aa152fd24f5ac745dcb91a29e1058c41af6ccb8c89c18b774a6c1b" exitCode=255 Apr 22 14:17:51.505706 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:51.505608 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" event={"ID":"6e03df22-4c80-4e88-b0af-86bd967d1940","Type":"ContainerDied","Data":"3acfb65ed2aa152fd24f5ac745dcb91a29e1058c41af6ccb8c89c18b774a6c1b"} Apr 22 14:17:51.513864 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:51.513827 2578 scope.go:117] 
"RemoveContainer" containerID="3acfb65ed2aa152fd24f5ac745dcb91a29e1058c41af6ccb8c89c18b774a6c1b" Apr 22 14:17:52.512124 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:52.512044 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" event={"ID":"78b16110-0b35-41a7-b840-f66b6fb4ac09","Type":"ContainerStarted","Data":"68434ff3cfca85d8192570e7adde4b5608d30c98d6f08a6328ad4d47313fabfd"} Apr 22 14:17:52.514098 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:52.514031 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="open-cluster-management-agent-addon/managed-serviceaccount-addon-agent-698579449c-lg588" event={"ID":"6e03df22-4c80-4e88-b0af-86bd967d1940","Type":"ContainerStarted","Data":"045fe2cf63e0b185a1f2f87bb53369dea4c99eafa52955fdc9f87626f159bf59"} Apr 22 14:17:52.534066 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:52.533960 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-4x527" podStartSLOduration=32.710826039 podStartE2EDuration="34.533939785s" podCreationTimestamp="2026-04-22 14:17:18 +0000 UTC" firstStartedPulling="2026-04-22 14:17:50.539867957 +0000 UTC m=+171.214671214" lastFinishedPulling="2026-04-22 14:17:52.362981694 +0000 UTC m=+173.037784960" observedRunningTime="2026-04-22 14:17:52.53216207 +0000 UTC m=+173.206965336" watchObservedRunningTime="2026-04-22 14:17:52.533939785 +0000 UTC m=+173.208743048" Apr 22 14:17:52.999526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:52.999494 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9"] Apr 22 14:17:53.002633 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.002603 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" Apr 22 14:17:53.005949 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.005906 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-tls\"" Apr 22 14:17:53.006089 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.006030 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-admission-webhook-dockercfg-mgbvt\"" Apr 22 14:17:53.016039 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.016012 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9"] Apr 22 14:17:53.100655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.100607 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6d2ba58b-9ecb-433c-9cd2-adebb69b6d62-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bj7j9\" (UID: \"6d2ba58b-9ecb-433c-9cd2-adebb69b6d62\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" Apr 22 14:17:53.201283 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.201242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certificates\" (UniqueName: \"kubernetes.io/secret/6d2ba58b-9ecb-433c-9cd2-adebb69b6d62-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bj7j9\" (UID: \"6d2ba58b-9ecb-433c-9cd2-adebb69b6d62\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" Apr 22 14:17:53.204145 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.204114 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certificates\" (UniqueName: 
\"kubernetes.io/secret/6d2ba58b-9ecb-433c-9cd2-adebb69b6d62-tls-certificates\") pod \"prometheus-operator-admission-webhook-57cf98b594-bj7j9\" (UID: \"6d2ba58b-9ecb-433c-9cd2-adebb69b6d62\") " pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" Apr 22 14:17:53.313870 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.313775 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" Apr 22 14:17:53.458992 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.458958 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9"] Apr 22 14:17:53.462418 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:17:53.462384 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d2ba58b_9ecb_433c_9cd2_adebb69b6d62.slice/crio-d955b98a03fc575c871c840e653549cab58f9c19e4b96a5ee9eb66599d243961 WatchSource:0}: Error finding container d955b98a03fc575c871c840e653549cab58f9c19e4b96a5ee9eb66599d243961: Status 404 returned error can't find the container with id d955b98a03fc575c871c840e653549cab58f9c19e4b96a5ee9eb66599d243961 Apr 22 14:17:53.518633 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:53.518588 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" event={"ID":"6d2ba58b-9ecb-433c-9cd2-adebb69b6d62","Type":"ContainerStarted","Data":"d955b98a03fc575c871c840e653549cab58f9c19e4b96a5ee9eb66599d243961"} Apr 22 14:17:54.698237 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:54.697887 2578 patch_prober.go:28] interesting pod/image-registry-59b68764cb-dh9r7 container/registry namespace/openshift-image-registry: Readiness probe status=failure output="HTTP probe failed with statuscode: 503" 
start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 22 14:17:54.698237 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:54.697963 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" podUID="774fe819-0129-443d-bc87-ddbe8c62267a" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 22 14:17:55.484890 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:17:55.484858 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-kxldt" Apr 22 14:18:02.546841 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:02.546801 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" event={"ID":"6d2ba58b-9ecb-433c-9cd2-adebb69b6d62","Type":"ContainerStarted","Data":"b6d9f598b6575ee46eeff670753ce663a45ab7a674e42a52794c076b291d011d"} Apr 22 14:18:02.547286 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:02.547109 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" Apr 22 14:18:02.548441 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:02.548406 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-kfnwb" event={"ID":"8fdb260a-3bb8-4141-8359-e18230a3d1ee","Type":"ContainerStarted","Data":"3cbecb6d25b78cacc6780e4f7e60714956a57b53d6fa9d89d65c8b85c3cdb7df"} Apr 22 14:18:02.548707 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:02.548670 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-kfnwb" Apr 22 14:18:02.553442 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:02.553411 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" Apr 22 14:18:02.558331 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:02.558284 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-kfnwb" Apr 22 14:18:02.578085 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:02.578026 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-admission-webhook-57cf98b594-bj7j9" podStartSLOduration=2.542600555 podStartE2EDuration="10.578008953s" podCreationTimestamp="2026-04-22 14:17:52 +0000 UTC" firstStartedPulling="2026-04-22 14:17:53.464583362 +0000 UTC m=+174.139386625" lastFinishedPulling="2026-04-22 14:18:01.499991771 +0000 UTC m=+182.174795023" observedRunningTime="2026-04-22 14:18:02.577267528 +0000 UTC m=+183.252070794" watchObservedRunningTime="2026-04-22 14:18:02.578008953 +0000 UTC m=+183.252812216" Apr 22 14:18:02.642472 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:02.642409 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-kfnwb" podStartSLOduration=2.32465447 podStartE2EDuration="18.642389375s" podCreationTimestamp="2026-04-22 14:17:44 +0000 UTC" firstStartedPulling="2026-04-22 14:17:45.229277343 +0000 UTC m=+165.904080587" lastFinishedPulling="2026-04-22 14:18:01.547012244 +0000 UTC m=+182.221815492" observedRunningTime="2026-04-22 14:18:02.641278229 +0000 UTC m=+183.316081699" watchObservedRunningTime="2026-04-22 14:18:02.642389375 +0000 UTC m=+183.317192640" Apr 22 14:18:03.134967 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.134924 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dchw2"] Apr 22 14:18:03.141134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.141106 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.145378 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.145349 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 22 14:18:03.146137 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.146106 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 22 14:18:03.146591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.146564 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-dhpsr\"" Apr 22 14:18:03.146695 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.146636 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 22 14:18:03.156396 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.156366 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dchw2"] Apr 22 14:18:03.293661 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.293616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/763d841d-2a98-4990-ade8-3c083f6c8372-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.293831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.293783 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/763d841d-2a98-4990-ade8-3c083f6c8372-prometheus-operator-kube-rbac-proxy-config\") pod 
\"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.293907 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.293827 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/763d841d-2a98-4990-ade8-3c083f6c8372-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.293907 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.293884 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cljkt\" (UniqueName: \"kubernetes.io/projected/763d841d-2a98-4990-ade8-3c083f6c8372-kube-api-access-cljkt\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.394736 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.394616 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/763d841d-2a98-4990-ade8-3c083f6c8372-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.394736 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.394695 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/763d841d-2a98-4990-ade8-3c083f6c8372-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " 
pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.394736 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.394732 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cljkt\" (UniqueName: \"kubernetes.io/projected/763d841d-2a98-4990-ade8-3c083f6c8372-kube-api-access-cljkt\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.395004 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.394776 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/763d841d-2a98-4990-ade8-3c083f6c8372-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.395560 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.395531 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/763d841d-2a98-4990-ade8-3c083f6c8372-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.397655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.397620 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/763d841d-2a98-4990-ade8-3c083f6c8372-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.397655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.397643 2578 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/763d841d-2a98-4990-ade8-3c083f6c8372-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.407801 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.407770 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cljkt\" (UniqueName: \"kubernetes.io/projected/763d841d-2a98-4990-ade8-3c083f6c8372-kube-api-access-cljkt\") pod \"prometheus-operator-5676c8c784-dchw2\" (UID: \"763d841d-2a98-4990-ade8-3c083f6c8372\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.452719 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.452674 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" Apr 22 14:18:03.591403 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:03.591370 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-dchw2"] Apr 22 14:18:03.608470 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:18:03.608440 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod763d841d_2a98_4990_ade8_3c083f6c8372.slice/crio-b2bf7875c35e252a0a1f9bc35cf16dadd05de2e1002da4c4e4f64a8933abef1f WatchSource:0}: Error finding container b2bf7875c35e252a0a1f9bc35cf16dadd05de2e1002da4c4e4f64a8933abef1f: Status 404 returned error can't find the container with id b2bf7875c35e252a0a1f9bc35cf16dadd05de2e1002da4c4e4f64a8933abef1f Apr 22 14:18:04.556678 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:04.556610 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" 
event={"ID":"763d841d-2a98-4990-ade8-3c083f6c8372","Type":"ContainerStarted","Data":"b2bf7875c35e252a0a1f9bc35cf16dadd05de2e1002da4c4e4f64a8933abef1f"} Apr 22 14:18:04.697588 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:04.697555 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" Apr 22 14:18:06.564322 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:06.564263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" event={"ID":"763d841d-2a98-4990-ade8-3c083f6c8372","Type":"ContainerStarted","Data":"09040e97817769c201611c711db8be4b6a36f33d7d01d4af5bd6a512109f861d"} Apr 22 14:18:06.564803 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:06.564329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" event={"ID":"763d841d-2a98-4990-ade8-3c083f6c8372","Type":"ContainerStarted","Data":"901fa7fe5b2118e842d3820b78a905c730ae5800101c47ff0506e024867d0018"} Apr 22 14:18:06.584635 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:06.584573 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-dchw2" podStartSLOduration=1.6514364179999999 podStartE2EDuration="3.584552614s" podCreationTimestamp="2026-04-22 14:18:03 +0000 UTC" firstStartedPulling="2026-04-22 14:18:03.610738904 +0000 UTC m=+184.285542146" lastFinishedPulling="2026-04-22 14:18:05.543855088 +0000 UTC m=+186.218658342" observedRunningTime="2026-04-22 14:18:06.584051519 +0000 UTC m=+187.258854786" watchObservedRunningTime="2026-04-22 14:18:06.584552614 +0000 UTC m=+187.259355878" Apr 22 14:18:08.709974 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.709935 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-998dw"] Apr 22 14:18:08.741227 ip-10-0-132-130 kubenswrapper[2578]: I0422 
14:18:08.741197 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-998dw" Apr 22 14:18:08.746726 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.746694 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 22 14:18:08.746868 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.746810 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 22 14:18:08.746868 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.746844 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 22 14:18:08.747889 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.747866 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-2pzx2\"" Apr 22 14:18:08.845667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845629 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-tls\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw" Apr 22 14:18:08.845845 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845688 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-metrics-client-ca\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw" Apr 22 14:18:08.845845 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845721 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-accelerators-collector-config\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.845845 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845751 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-root\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.845845 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-textfile\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.846045 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845864 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5bnpn\" (UniqueName: \"kubernetes.io/projected/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-kube-api-access-5bnpn\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.846045 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845882 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-wtmp\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.846045 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845903 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.846045 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.845972 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-sys\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.946505 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.946466 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-textfile\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.946788 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.946766 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5bnpn\" (UniqueName: \"kubernetes.io/projected/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-kube-api-access-5bnpn\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.946918 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.946902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-wtmp\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947039 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.947023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947152 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.947132 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-sys\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947152 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.946829 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-textfile\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947357 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.947163 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-tls\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947357 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.947229 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-metrics-client-ca\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947357 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.947262 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-accelerators-collector-config\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947357 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.947320 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-root\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947550 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.947395 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-root\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.947967 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.947937 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-accelerators-collector-config\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.948690 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.948664 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-metrics-client-ca\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.948827 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.948804 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-wtmp\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.948929 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.948859 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-sys\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.950007 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.949973 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-tls\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.950107 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.950005 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:08.957317 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:08.957252 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5bnpn\" (UniqueName: \"kubernetes.io/projected/0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9-kube-api-access-5bnpn\") pod \"node-exporter-998dw\" (UID: \"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9\") " pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:09.051817 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:09.051734 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-998dw"
Apr 22 14:18:09.062113 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:18:09.062074 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0b47ac76_f656_4ddd_b1b8_4ce0dbaa74d9.slice/crio-460d5302e878d204acc7dc010a2d19c93df7fa03b3a4984e0042e164058be0b6 WatchSource:0}: Error finding container 460d5302e878d204acc7dc010a2d19c93df7fa03b3a4984e0042e164058be0b6: Status 404 returned error can't find the container with id 460d5302e878d204acc7dc010a2d19c93df7fa03b3a4984e0042e164058be0b6
Apr 22 14:18:09.576021 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:09.575984 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-998dw" event={"ID":"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9","Type":"ContainerStarted","Data":"460d5302e878d204acc7dc010a2d19c93df7fa03b3a4984e0042e164058be0b6"}
Apr 22 14:18:09.712665 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:09.712482 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" podUID="774fe819-0129-443d-bc87-ddbe8c62267a" containerName="registry" containerID="cri-o://cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e" gracePeriod=30
Apr 22 14:18:10.091637 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.091610 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7"
Apr 22 14:18:10.157419 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157383 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-trusted-ca\") pod \"774fe819-0129-443d-bc87-ddbe8c62267a\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") "
Apr 22 14:18:10.157607 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157431 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-registry-certificates\") pod \"774fe819-0129-443d-bc87-ddbe8c62267a\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") "
Apr 22 14:18:10.157607 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157461 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-installation-pull-secrets\") pod \"774fe819-0129-443d-bc87-ddbe8c62267a\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") "
Apr 22 14:18:10.157607 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157492 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") pod \"774fe819-0129-443d-bc87-ddbe8c62267a\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") "
Apr 22 14:18:10.157607 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157527 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/774fe819-0129-443d-bc87-ddbe8c62267a-ca-trust-extracted\") pod \"774fe819-0129-443d-bc87-ddbe8c62267a\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") "
Apr 22 14:18:10.157607 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157582 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-image-registry-private-configuration\") pod \"774fe819-0129-443d-bc87-ddbe8c62267a\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") "
Apr 22 14:18:10.157607 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157606 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-jz428\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-kube-api-access-jz428\") pod \"774fe819-0129-443d-bc87-ddbe8c62267a\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") "
Apr 22 14:18:10.157923 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157644 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-bound-sa-token\") pod \"774fe819-0129-443d-bc87-ddbe8c62267a\" (UID: \"774fe819-0129-443d-bc87-ddbe8c62267a\") "
Apr 22 14:18:10.157923 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157879 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "774fe819-0129-443d-bc87-ddbe8c62267a" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:18:10.158028 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.157958 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "774fe819-0129-443d-bc87-ddbe8c62267a" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a"). InnerVolumeSpecName "registry-certificates". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:18:10.160244 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.160184 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "774fe819-0129-443d-bc87-ddbe8c62267a" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:18:10.160774 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.160281 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-kube-api-access-jz428" (OuterVolumeSpecName: "kube-api-access-jz428") pod "774fe819-0129-443d-bc87-ddbe8c62267a" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a"). InnerVolumeSpecName "kube-api-access-jz428". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:18:10.160774 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.160623 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "774fe819-0129-443d-bc87-ddbe8c62267a" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:18:10.161070 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.160895 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "774fe819-0129-443d-bc87-ddbe8c62267a" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:18:10.161280 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.161161 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "774fe819-0129-443d-bc87-ddbe8c62267a" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:18:10.170349 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.170315 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/774fe819-0129-443d-bc87-ddbe8c62267a-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "774fe819-0129-443d-bc87-ddbe8c62267a" (UID: "774fe819-0129-443d-bc87-ddbe8c62267a"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:18:10.258320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.258227 2578 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-image-registry-private-configuration\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:10.258320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.258256 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-jz428\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-kube-api-access-jz428\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:10.258320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.258265 2578 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-bound-sa-token\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:10.258320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.258275 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-trusted-ca\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:10.258320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.258285 2578 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/774fe819-0129-443d-bc87-ddbe8c62267a-registry-certificates\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:10.258320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.258325 2578 reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/774fe819-0129-443d-bc87-ddbe8c62267a-installation-pull-secrets\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:10.258755 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.258342 2578 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/774fe819-0129-443d-bc87-ddbe8c62267a-registry-tls\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:10.258755 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.258355 2578 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/774fe819-0129-443d-bc87-ddbe8c62267a-ca-trust-extracted\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:10.530662 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.530625 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"]
Apr 22 14:18:10.530986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.530964 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="774fe819-0129-443d-bc87-ddbe8c62267a" containerName="registry"
Apr 22 14:18:10.531080 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.530988 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="774fe819-0129-443d-bc87-ddbe8c62267a" containerName="registry"
Apr 22 14:18:10.531080 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.531052 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="774fe819-0129-443d-bc87-ddbe8c62267a" containerName="registry"
Apr 22 14:18:10.562125 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.562084 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"]
Apr 22 14:18:10.562287 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.562244 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.565273 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.565121 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-metrics\""
Apr 22 14:18:10.565273 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.565156 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy\""
Apr 22 14:18:10.565273 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.565250 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-tls\""
Apr 22 14:18:10.565273 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.565121 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-rules\""
Apr 22 14:18:10.565649 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.565161 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-dockercfg-plvsx\""
Apr 22 14:18:10.565649 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.565563 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-grpc-tls-ceo1621dv1f8j\""
Apr 22 14:18:10.565745 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.565689 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"thanos-querier-kube-rbac-proxy-web\""
Apr 22 14:18:10.581861 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.581823 2578 generic.go:358] "Generic (PLEG): container finished" podID="774fe819-0129-443d-bc87-ddbe8c62267a" containerID="cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e" exitCode=0
Apr 22 14:18:10.582131 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.582090 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7"
Apr 22 14:18:10.582288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.582082 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" event={"ID":"774fe819-0129-443d-bc87-ddbe8c62267a","Type":"ContainerDied","Data":"cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e"}
Apr 22 14:18:10.582397 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.582342 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-59b68764cb-dh9r7" event={"ID":"774fe819-0129-443d-bc87-ddbe8c62267a","Type":"ContainerDied","Data":"6e48419fe1a95c006434857505a589c599d25098f8c5896e863b2e15a2b533bd"}
Apr 22 14:18:10.582397 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.582383 2578 scope.go:117] "RemoveContainer" containerID="cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e"
Apr 22 14:18:10.585939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.585810 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-998dw" event={"ID":"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9","Type":"ContainerStarted","Data":"bff4237ca3db1a04122b6508642c59f20b30853002574e34d02fb648e7f7a089"}
Apr 22 14:18:10.594684 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.594658 2578 scope.go:117] "RemoveContainer" containerID="cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e"
Apr 22 14:18:10.595032 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:18:10.595000 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e\": container with ID starting with cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e not found: ID does not exist" containerID="cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e"
Apr 22 14:18:10.595118 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.595045 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e"} err="failed to get container status \"cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e\": rpc error: code = NotFound desc = could not find container \"cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e\": container with ID starting with cd620871f92d9c07743d2602f794a69406ea4084be132c409ad4593e8f5bc84e not found: ID does not exist"
Apr 22 14:18:10.615524 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.615493 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-59b68764cb-dh9r7"]
Apr 22 14:18:10.620709 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.620671 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-59b68764cb-dh9r7"]
Apr 22 14:18:10.662345 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.662113 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.662345 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.662206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.662345 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.662257 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6sznl\" (UniqueName: \"kubernetes.io/projected/ed392dfd-aa36-4d8c-a598-160a741281e8-kube-api-access-6sznl\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.662345 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.662285 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-grpc-tls\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.662675 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.662393 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-tls\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.662675 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.662443 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.662675 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.662489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed392dfd-aa36-4d8c-a598-160a741281e8-metrics-client-ca\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.662675 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.662520 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.763673 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.763591 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.763673 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.763658 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6sznl\" (UniqueName: \"kubernetes.io/projected/ed392dfd-aa36-4d8c-a598-160a741281e8-kube-api-access-6sznl\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.764168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.763812 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-grpc-tls\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.764168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.763864 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-tls\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.764168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.763902 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.764168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.763932 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed392dfd-aa36-4d8c-a598-160a741281e8-metrics-client-ca\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.764168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.763957 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.764168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.764024 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.764738 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.764713 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/ed392dfd-aa36-4d8c-a598-160a741281e8-metrics-client-ca\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.766711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.766679 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-rules\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-rules\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.766820 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.766769 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.766820 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.766798 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-web\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.767338 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.767292 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-tls\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-tls\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.767436 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.767343 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-grpc-tls\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-grpc-tls\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.767498 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.767431 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-thanos-querier-kube-rbac-proxy-metrics\" (UniqueName: \"kubernetes.io/secret/ed392dfd-aa36-4d8c-a598-160a741281e8-secret-thanos-querier-kube-rbac-proxy-metrics\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.781758 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.781728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6sznl\" (UniqueName: \"kubernetes.io/projected/ed392dfd-aa36-4d8c-a598-160a741281e8-kube-api-access-6sznl\") pod \"thanos-querier-57b7dd87cb-9s2ws\" (UID: \"ed392dfd-aa36-4d8c-a598-160a741281e8\") " pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:10.874216 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:10.874174 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:11.034570 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:11.034462 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"]
Apr 22 14:18:11.038656 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:18:11.038615 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poded392dfd_aa36_4d8c_a598_160a741281e8.slice/crio-bb677e775c85977b1220e269617202f85b0dc50b78c8fadfd6803e1b4c446a19 WatchSource:0}: Error finding container bb677e775c85977b1220e269617202f85b0dc50b78c8fadfd6803e1b4c446a19: Status 404 returned error can't find the container with id bb677e775c85977b1220e269617202f85b0dc50b78c8fadfd6803e1b4c446a19
Apr 22 14:18:11.591180 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:11.591137 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws" event={"ID":"ed392dfd-aa36-4d8c-a598-160a741281e8","Type":"ContainerStarted","Data":"bb677e775c85977b1220e269617202f85b0dc50b78c8fadfd6803e1b4c446a19"}
Apr 22 14:18:11.592790 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:11.592757 2578 generic.go:358] "Generic (PLEG): container finished" podID="0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9" containerID="bff4237ca3db1a04122b6508642c59f20b30853002574e34d02fb648e7f7a089" exitCode=0
Apr 22 14:18:11.592965 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:11.592803 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-998dw" event={"ID":"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9","Type":"ContainerDied","Data":"bff4237ca3db1a04122b6508642c59f20b30853002574e34d02fb648e7f7a089"}
Apr 22 14:18:11.951933 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:11.951894 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="774fe819-0129-443d-bc87-ddbe8c62267a"
path="/var/lib/kubelet/pods/774fe819-0129-443d-bc87-ddbe8c62267a/volumes" Apr 22 14:18:12.597874 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:12.597838 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-998dw" event={"ID":"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9","Type":"ContainerStarted","Data":"e3c6ad177ca6fbcf29f0e60634ea8a0d90579ff7a10b5f6723830c361b643241"} Apr 22 14:18:12.598137 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:12.597880 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-998dw" event={"ID":"0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9","Type":"ContainerStarted","Data":"9d41f0c27907d2d8c9b33e08aec53ee213248f87501564e89a54d2e76705bf60"} Apr 22 14:18:12.621856 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:12.621795 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-998dw" podStartSLOduration=3.360285971 podStartE2EDuration="4.62177518s" podCreationTimestamp="2026-04-22 14:18:08 +0000 UTC" firstStartedPulling="2026-04-22 14:18:09.064437972 +0000 UTC m=+189.739241217" lastFinishedPulling="2026-04-22 14:18:10.325927171 +0000 UTC m=+191.000730426" observedRunningTime="2026-04-22 14:18:12.620711418 +0000 UTC m=+193.295514693" watchObservedRunningTime="2026-04-22 14:18:12.62177518 +0000 UTC m=+193.296578447" Apr 22 14:18:13.280028 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.279991 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58758fd9f6-vxmht"] Apr 22 14:18:13.305085 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.305053 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58758fd9f6-vxmht"] Apr 22 14:18:13.305238 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.305183 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.308623 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.308594 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 22 14:18:13.310127 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.309814 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 22 14:18:13.310127 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.309902 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 22 14:18:13.310127 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.309904 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-78jk5\"" Apr 22 14:18:13.310127 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.310026 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 22 14:18:13.310442 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.310189 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 22 14:18:13.310442 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.310271 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h"] Apr 22 14:18:13.314279 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.314252 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 22 14:18:13.329466 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.329437 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h"] Apr 22 14:18:13.329606 ip-10-0-132-130 kubenswrapper[2578]: I0422 
14:18:13.329573 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" Apr 22 14:18:13.332127 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.332104 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"monitoring-plugin-cert\"" Apr 22 14:18:13.332247 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.332134 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"default-dockercfg-zmpz4\"" Apr 22 14:18:13.386205 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.386174 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-oauth-config\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.386405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.386217 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-oauth-serving-cert\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.386405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.386240 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-console-config\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.386405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.386264 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-serving-cert\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.386405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.386293 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-service-ca\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.386405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.386357 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ml2v7\" (UniqueName: \"kubernetes.io/projected/eb6cda05-5633-4ff3-9938-3bde0196062f-kube-api-access-ml2v7\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.386405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.386383 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-trusted-ca-bundle\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.487308 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.487277 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-oauth-config\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " 
pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.487508 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.487337 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-oauth-serving-cert\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.487508 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.487363 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-console-config\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.487508 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.487395 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-serving-cert\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.487508 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.487431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-service-ca\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.487508 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.487460 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2cdf65a3-3a9c-48f6-be02-eae076517f5d-monitoring-plugin-cert\") 
pod \"monitoring-plugin-7dccd58f55-p4m4h\" (UID: \"2cdf65a3-3a9c-48f6-be02-eae076517f5d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" Apr 22 14:18:13.487776 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.487490 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ml2v7\" (UniqueName: \"kubernetes.io/projected/eb6cda05-5633-4ff3-9938-3bde0196062f-kube-api-access-ml2v7\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.487776 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.487549 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-trusted-ca-bundle\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.488126 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.488097 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-oauth-serving-cert\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.488237 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.488187 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-console-config\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.488237 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.488198 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-service-ca\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.488614 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.488593 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-trusted-ca-bundle\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.489922 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.489902 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-oauth-config\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.490010 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.489997 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-serving-cert\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.504142 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.504121 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ml2v7\" (UniqueName: \"kubernetes.io/projected/eb6cda05-5633-4ff3-9938-3bde0196062f-kube-api-access-ml2v7\") pod \"console-58758fd9f6-vxmht\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") " pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.588443 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.588420 2578 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2cdf65a3-3a9c-48f6-be02-eae076517f5d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-p4m4h\" (UID: \"2cdf65a3-3a9c-48f6-be02-eae076517f5d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" Apr 22 14:18:13.590893 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.590864 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"monitoring-plugin-cert\" (UniqueName: \"kubernetes.io/secret/2cdf65a3-3a9c-48f6-be02-eae076517f5d-monitoring-plugin-cert\") pod \"monitoring-plugin-7dccd58f55-p4m4h\" (UID: \"2cdf65a3-3a9c-48f6-be02-eae076517f5d\") " pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" Apr 22 14:18:13.617115 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.617069 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58758fd9f6-vxmht" Apr 22 14:18:13.638360 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.638324 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" Apr 22 14:18:13.769667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.769639 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58758fd9f6-vxmht"] Apr 22 14:18:13.771880 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:18:13.771847 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podeb6cda05_5633_4ff3_9938_3bde0196062f.slice/crio-b29eb7abf285f6cfd65c7c2a1202b5f65a5bf2e5be2fe8a9b1cc1fd7642d9033 WatchSource:0}: Error finding container b29eb7abf285f6cfd65c7c2a1202b5f65a5bf2e5be2fe8a9b1cc1fd7642d9033: Status 404 returned error can't find the container with id b29eb7abf285f6cfd65c7c2a1202b5f65a5bf2e5be2fe8a9b1cc1fd7642d9033 Apr 22 14:18:13.803785 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:13.803753 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h"] Apr 22 14:18:13.806628 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:18:13.806599 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2cdf65a3_3a9c_48f6_be02_eae076517f5d.slice/crio-d49a8d2f5712de581813ff416ab85692368e2895f1add77500acda3a52b8cf7b WatchSource:0}: Error finding container d49a8d2f5712de581813ff416ab85692368e2895f1add77500acda3a52b8cf7b: Status 404 returned error can't find the container with id d49a8d2f5712de581813ff416ab85692368e2895f1add77500acda3a52b8cf7b Apr 22 14:18:14.611948 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:14.611908 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" event={"ID":"2cdf65a3-3a9c-48f6-be02-eae076517f5d","Type":"ContainerStarted","Data":"d49a8d2f5712de581813ff416ab85692368e2895f1add77500acda3a52b8cf7b"} Apr 22 14:18:14.614252 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:18:14.614222 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58758fd9f6-vxmht" event={"ID":"eb6cda05-5633-4ff3-9938-3bde0196062f","Type":"ContainerStarted","Data":"b29eb7abf285f6cfd65c7c2a1202b5f65a5bf2e5be2fe8a9b1cc1fd7642d9033"} Apr 22 14:18:14.617132 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:14.617047 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws" event={"ID":"ed392dfd-aa36-4d8c-a598-160a741281e8","Type":"ContainerStarted","Data":"574c780dd9455f04bc12724fe7ff486e154a1554a9475e3078c24ecb22f4d4ca"} Apr 22 14:18:14.617132 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:14.617084 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws" event={"ID":"ed392dfd-aa36-4d8c-a598-160a741281e8","Type":"ContainerStarted","Data":"b42a438f59b9620fec2fa0abff328c8ecf63d684c4eb00608c5fd55e07db8720"} Apr 22 14:18:14.617132 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:14.617098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws" event={"ID":"ed392dfd-aa36-4d8c-a598-160a741281e8","Type":"ContainerStarted","Data":"819d4e33738200e5db9b6de5bf7c5a3e9a2ec943a1b18a066f5c16401b95c5b6"} Apr 22 14:18:17.628712 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:17.628657 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws" event={"ID":"ed392dfd-aa36-4d8c-a598-160a741281e8","Type":"ContainerStarted","Data":"b2648da810ffb804eb0e6d7de42fea84cac545c5e5a7075665f291ef47882182"} Apr 22 14:18:17.628712 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:17.628717 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws" 
event={"ID":"ed392dfd-aa36-4d8c-a598-160a741281e8","Type":"ContainerStarted","Data":"65381b733e6f4bfbbc144cbdc6586542da9d69cc5bc81ae6ddf335b473fa0a61"} Apr 22 14:18:17.629855 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:17.629833 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" event={"ID":"2cdf65a3-3a9c-48f6-be02-eae076517f5d","Type":"ContainerStarted","Data":"9cae3871789c23ac545ef0d5ba3c937e5a662918594f1b64c03d319b90b98ed4"} Apr 22 14:18:17.630136 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:17.630114 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" Apr 22 14:18:17.634935 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:17.634907 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" Apr 22 14:18:17.647168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:17.647128 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/monitoring-plugin-7dccd58f55-p4m4h" podStartSLOduration=1.340655384 podStartE2EDuration="4.647116091s" podCreationTimestamp="2026-04-22 14:18:13 +0000 UTC" firstStartedPulling="2026-04-22 14:18:13.808476294 +0000 UTC m=+194.483279538" lastFinishedPulling="2026-04-22 14:18:17.114936986 +0000 UTC m=+197.789740245" observedRunningTime="2026-04-22 14:18:17.64613016 +0000 UTC m=+198.320933425" watchObservedRunningTime="2026-04-22 14:18:17.647116091 +0000 UTC m=+198.321919364" Apr 22 14:18:18.223665 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.223629 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58758fd9f6-vxmht"] Apr 22 14:18:18.260549 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.260510 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-6f6648c54c-r58c5"] Apr 22 14:18:18.278903 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.278874 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-6f6648c54c-r58c5"] Apr 22 14:18:18.279079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.279013 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.433445 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.433408 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-serving-cert\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.433445 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.433450 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-oauth-serving-cert\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.433728 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.433525 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-service-ca\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.433728 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.433568 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cq9xx\" (UniqueName: \"kubernetes.io/projected/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-kube-api-access-cq9xx\") pod 
\"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.433728 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.433680 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-config\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.433728 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.433716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-trusted-ca-bundle\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.433886 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.433750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-oauth-config\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.534850 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.534762 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-serving-cert\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.534850 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.534799 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-oauth-serving-cert\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.534850 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.534820 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-service-ca\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.534850 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.534844 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cq9xx\" (UniqueName: \"kubernetes.io/projected/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-kube-api-access-cq9xx\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.535173 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.534927 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-config\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.535173 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.534953 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-trusted-ca-bundle\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:18:18.535173 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.534987 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-oauth-config\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.535655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.535629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-config\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.535769 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.535692 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-service-ca\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.535769 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.535742 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-oauth-serving-cert\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.536040 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.536019 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-trusted-ca-bundle\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.537466 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.537441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-oauth-config\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.537552 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.537482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-serving-cert\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.542648 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.542629 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cq9xx\" (UniqueName: \"kubernetes.io/projected/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-kube-api-access-cq9xx\") pod \"console-6f6648c54c-r58c5\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.588828 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.588789 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:18.637777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.637731 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws" event={"ID":"ed392dfd-aa36-4d8c-a598-160a741281e8","Type":"ContainerStarted","Data":"860ee93bd92920da50e5dc78f2931ab7b9f4a7fb3761c0a2b6e7975e773e3624"}
Apr 22 14:18:18.638674 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.638648 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:18.639998 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.639969 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58758fd9f6-vxmht" event={"ID":"eb6cda05-5633-4ff3-9938-3bde0196062f","Type":"ContainerStarted","Data":"3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254"}
Apr 22 14:18:18.645928 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.645899 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws"
Apr 22 14:18:18.677153 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.677091 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/thanos-querier-57b7dd87cb-9s2ws" podStartSLOduration=2.603343096 podStartE2EDuration="8.677075486s" podCreationTimestamp="2026-04-22 14:18:10 +0000 UTC" firstStartedPulling="2026-04-22 14:18:11.041206769 +0000 UTC m=+191.716010025" lastFinishedPulling="2026-04-22 14:18:17.11493917 +0000 UTC m=+197.789742415" observedRunningTime="2026-04-22 14:18:18.676527244 +0000 UTC m=+199.351330510" watchObservedRunningTime="2026-04-22 14:18:18.677075486 +0000 UTC m=+199.351878750"
Apr 22 14:18:18.714260 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.714227 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["openshift-console/console-6f6648c54c-r58c5"]
Apr 22 14:18:18.717899 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:18.717850 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58758fd9f6-vxmht" podStartSLOduration=1.968310405 podStartE2EDuration="5.717833652s" podCreationTimestamp="2026-04-22 14:18:13 +0000 UTC" firstStartedPulling="2026-04-22 14:18:13.773837108 +0000 UTC m=+194.448640351" lastFinishedPulling="2026-04-22 14:18:17.523360355 +0000 UTC m=+198.198163598" observedRunningTime="2026-04-22 14:18:18.716731775 +0000 UTC m=+199.391535041" watchObservedRunningTime="2026-04-22 14:18:18.717833652 +0000 UTC m=+199.392636918"
Apr 22 14:18:18.718431 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:18:18.718407 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode7c9cb0e_e6e5_43cd_a14e_efb10585d395.slice/crio-6f99ff03ada5effb912d596280584a3967a4f6d1434265d6779932e0b750e829 WatchSource:0}: Error finding container 6f99ff03ada5effb912d596280584a3967a4f6d1434265d6779932e0b750e829: Status 404 returned error can't find the container with id 6f99ff03ada5effb912d596280584a3967a4f6d1434265d6779932e0b750e829
Apr 22 14:18:19.646139 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:19.646103 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f6648c54c-r58c5" event={"ID":"e7c9cb0e-e6e5-43cd-a14e-efb10585d395","Type":"ContainerStarted","Data":"ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024"}
Apr 22 14:18:19.646139 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:19.646145 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f6648c54c-r58c5" event={"ID":"e7c9cb0e-e6e5-43cd-a14e-efb10585d395","Type":"ContainerStarted","Data":"6f99ff03ada5effb912d596280584a3967a4f6d1434265d6779932e0b750e829"}
Apr 22 14:18:19.666816 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:19.666766 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-6f6648c54c-r58c5" podStartSLOduration=1.666750166 podStartE2EDuration="1.666750166s" podCreationTimestamp="2026-04-22 14:18:18 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:18:19.66611716 +0000 UTC m=+200.340920439" watchObservedRunningTime="2026-04-22 14:18:19.666750166 +0000 UTC m=+200.341553431"
Apr 22 14:18:23.617815 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:23.617775 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58758fd9f6-vxmht"
Apr 22 14:18:28.589812 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:28.589781 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:28.590276 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:28.589828 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:28.594641 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:28.594617 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:28.675442 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:28.675414 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-6f6648c54c-r58c5"
Apr 22 14:18:36.696738 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:36.696696 2578 generic.go:358] "Generic (PLEG): container finished" podID="e0d665af-9e8f-41f5-bc80-5b21a812d08d" containerID="4009fc62dbadd4119ff873a7bdc6a29b1ad6a3091eeccb110cbd3121bcc17c10" exitCode=0
Apr 22 14:18:36.697106 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:36.696770 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" event={"ID":"e0d665af-9e8f-41f5-bc80-5b21a812d08d","Type":"ContainerDied","Data":"4009fc62dbadd4119ff873a7bdc6a29b1ad6a3091eeccb110cbd3121bcc17c10"}
Apr 22 14:18:36.697106 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:36.697073 2578 scope.go:117] "RemoveContainer" containerID="4009fc62dbadd4119ff873a7bdc6a29b1ad6a3091eeccb110cbd3121bcc17c10"
Apr 22 14:18:37.703729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:37.703695 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca-operator/service-ca-operator-d6fc45fc5-88d4q" event={"ID":"e0d665af-9e8f-41f5-bc80-5b21a812d08d","Type":"ContainerStarted","Data":"54028c8cdc7d1929a8f09e2dec9f42fa083d798608f8bf365ad5e2aa0ce8e785"}
Apr 22 14:18:44.667702 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:44.667638 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58758fd9f6-vxmht" podUID="eb6cda05-5633-4ff3-9938-3bde0196062f" containerName="console" containerID="cri-o://3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254" gracePeriod=15
Apr 22 14:18:44.923022 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:44.922966 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58758fd9f6-vxmht_eb6cda05-5633-4ff3-9938-3bde0196062f/console/0.log"
Apr 22 14:18:44.923130 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:44.923037 2578 util.go:48] "No ready sandbox for pod can be found.
Need to start a new one" pod="openshift-console/console-58758fd9f6-vxmht"
Apr 22 14:18:45.045010 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.044971 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-oauth-serving-cert\") pod \"eb6cda05-5633-4ff3-9938-3bde0196062f\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") "
Apr 22 14:18:45.045010 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045013 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-oauth-config\") pod \"eb6cda05-5633-4ff3-9938-3bde0196062f\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") "
Apr 22 14:18:45.045259 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045035 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-trusted-ca-bundle\") pod \"eb6cda05-5633-4ff3-9938-3bde0196062f\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") "
Apr 22 14:18:45.045259 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045063 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-console-config\") pod \"eb6cda05-5633-4ff3-9938-3bde0196062f\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") "
Apr 22 14:18:45.045259 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045082 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ml2v7\" (UniqueName: \"kubernetes.io/projected/eb6cda05-5633-4ff3-9938-3bde0196062f-kube-api-access-ml2v7\") pod \"eb6cda05-5633-4ff3-9938-3bde0196062f\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") "
Apr 22 14:18:45.045259 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045103 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-service-ca\") pod \"eb6cda05-5633-4ff3-9938-3bde0196062f\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") "
Apr 22 14:18:45.045259 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045148 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-serving-cert\") pod \"eb6cda05-5633-4ff3-9938-3bde0196062f\" (UID: \"eb6cda05-5633-4ff3-9938-3bde0196062f\") "
Apr 22 14:18:45.045528 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045502 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "eb6cda05-5633-4ff3-9938-3bde0196062f" (UID: "eb6cda05-5633-4ff3-9938-3bde0196062f"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:18:45.045743 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045699 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-console-config" (OuterVolumeSpecName: "console-config") pod "eb6cda05-5633-4ff3-9938-3bde0196062f" (UID: "eb6cda05-5633-4ff3-9938-3bde0196062f"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:18:45.045743 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045714 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-service-ca" (OuterVolumeSpecName: "service-ca") pod "eb6cda05-5633-4ff3-9938-3bde0196062f" (UID: "eb6cda05-5633-4ff3-9938-3bde0196062f"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:18:45.045872 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.045788 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "eb6cda05-5633-4ff3-9938-3bde0196062f" (UID: "eb6cda05-5633-4ff3-9938-3bde0196062f"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue ""
Apr 22 14:18:45.047455 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.047422 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "eb6cda05-5633-4ff3-9938-3bde0196062f" (UID: "eb6cda05-5633-4ff3-9938-3bde0196062f"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:18:45.047552 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.047501 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "eb6cda05-5633-4ff3-9938-3bde0196062f" (UID: "eb6cda05-5633-4ff3-9938-3bde0196062f"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:18:45.047552 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.047535 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/eb6cda05-5633-4ff3-9938-3bde0196062f-kube-api-access-ml2v7" (OuterVolumeSpecName: "kube-api-access-ml2v7") pod "eb6cda05-5633-4ff3-9938-3bde0196062f" (UID: "eb6cda05-5633-4ff3-9938-3bde0196062f"). InnerVolumeSpecName "kube-api-access-ml2v7". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:18:45.146605 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.146562 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-oauth-serving-cert\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:45.146605 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.146597 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-oauth-config\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:45.146605 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.146606 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-trusted-ca-bundle\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:45.146605 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.146616 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-console-config\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:45.146860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.146625 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ml2v7\" (UniqueName:
\"kubernetes.io/projected/eb6cda05-5633-4ff3-9938-3bde0196062f-kube-api-access-ml2v7\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:45.146860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.146634 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/eb6cda05-5633-4ff3-9938-3bde0196062f-service-ca\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:45.146860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.146643 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/eb6cda05-5633-4ff3-9938-3bde0196062f-console-serving-cert\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:18:45.728339 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.728291 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58758fd9f6-vxmht_eb6cda05-5633-4ff3-9938-3bde0196062f/console/0.log"
Apr 22 14:18:45.728771 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.728367 2578 generic.go:358] "Generic (PLEG): container finished" podID="eb6cda05-5633-4ff3-9938-3bde0196062f" containerID="3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254" exitCode=2
Apr 22 14:18:45.728771 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.728431 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58758fd9f6-vxmht"
Apr 22 14:18:45.728771 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.728456 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58758fd9f6-vxmht" event={"ID":"eb6cda05-5633-4ff3-9938-3bde0196062f","Type":"ContainerDied","Data":"3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254"}
Apr 22 14:18:45.728771 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.728493 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58758fd9f6-vxmht" event={"ID":"eb6cda05-5633-4ff3-9938-3bde0196062f","Type":"ContainerDied","Data":"b29eb7abf285f6cfd65c7c2a1202b5f65a5bf2e5be2fe8a9b1cc1fd7642d9033"}
Apr 22 14:18:45.728771 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.728511 2578 scope.go:117] "RemoveContainer" containerID="3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254"
Apr 22 14:18:45.737085 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.737063 2578 scope.go:117] "RemoveContainer" containerID="3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254"
Apr 22 14:18:45.737513 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:18:45.737491 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254\": container with ID starting with 3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254 not found: ID does not exist" containerID="3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254"
Apr 22 14:18:45.737573 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.737523 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254"} err="failed to get container status \"3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254\": rpc error: code = NotFound desc = could not find container \"3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254\": container with ID starting with 3fecf28e0d4ace9f46e9d949e65b5f6119e6c93f82f669e09338204f3fe2c254 not found: ID does not exist"
Apr 22 14:18:45.752118 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.752089 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58758fd9f6-vxmht"]
Apr 22 14:18:45.755637 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.755614 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-58758fd9f6-vxmht"]
Apr 22 14:18:45.951467 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:18:45.951432 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="eb6cda05-5633-4ff3-9938-3bde0196062f" path="/var/lib/kubelet/pods/eb6cda05-5633-4ff3-9938-3bde0196062f/volumes"
Apr 22 14:19:11.668795 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:11.668699 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:19:11.671197 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:11.671148 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/aee73a14-6669-4d65-8987-69628270ae6d-metrics-certs\") pod \"network-metrics-daemon-8q2mm\" (UID: \"aee73a14-6669-4d65-8987-69628270ae6d\") " pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:19:11.751113 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:11.751080 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-4nhsh\""
Apr 22 14:19:11.758449 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:11.758425 2578
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-8q2mm"
Apr 22 14:19:11.880411 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:11.880337 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-8q2mm"]
Apr 22 14:19:11.882875 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:19:11.882834 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podaee73a14_6669_4d65_8987_69628270ae6d.slice/crio-972051daeaa527f84be2d085cd12587b0225c51c2cd6bc15ce77517e0be9d223 WatchSource:0}: Error finding container 972051daeaa527f84be2d085cd12587b0225c51c2cd6bc15ce77517e0be9d223: Status 404 returned error can't find the container with id 972051daeaa527f84be2d085cd12587b0225c51c2cd6bc15ce77517e0be9d223
Apr 22 14:19:12.808581 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:12.808544 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8q2mm" event={"ID":"aee73a14-6669-4d65-8987-69628270ae6d","Type":"ContainerStarted","Data":"972051daeaa527f84be2d085cd12587b0225c51c2cd6bc15ce77517e0be9d223"}
Apr 22 14:19:13.812521 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:13.812484 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8q2mm" event={"ID":"aee73a14-6669-4d65-8987-69628270ae6d","Type":"ContainerStarted","Data":"aab8d828f138657d0f844598ec64010a5e758732bfc84c6a8e740655696e2a77"}
Apr 22 14:19:13.812521 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:13.812522 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-8q2mm" event={"ID":"aee73a14-6669-4d65-8987-69628270ae6d","Type":"ContainerStarted","Data":"ef00fcd0c7949e3e600cb6674e3d0b1f163ed71c279f34cc5ff278d3b9ce76e1"}
Apr 22 14:19:13.830187 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:13.830134 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-8q2mm" podStartSLOduration=252.875310872 podStartE2EDuration="4m13.830120255s" podCreationTimestamp="2026-04-22 14:15:00 +0000 UTC" firstStartedPulling="2026-04-22 14:19:11.884693105 +0000 UTC m=+252.559496349" lastFinishedPulling="2026-04-22 14:19:12.839502486 +0000 UTC m=+253.514305732" observedRunningTime="2026-04-22 14:19:13.828123989 +0000 UTC m=+254.502927254" watchObservedRunningTime="2026-04-22 14:19:13.830120255 +0000 UTC m=+254.504923519"
Apr 22 14:19:25.862220 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.862186 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-89c5f8668-4vhtf"]
Apr 22 14:19:25.862721 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.862486 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="eb6cda05-5633-4ff3-9938-3bde0196062f" containerName="console"
Apr 22 14:19:25.862721 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.862497 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="eb6cda05-5633-4ff3-9938-3bde0196062f" containerName="console"
Apr 22 14:19:25.862721 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.862546 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="eb6cda05-5633-4ff3-9938-3bde0196062f" containerName="console"
Apr 22 14:19:25.866569 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.866546 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:25.894759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.894729 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-89c5f8668-4vhtf"]
Apr 22 14:19:25.981460 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.981417 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-trusted-ca-bundle\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:25.981460 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.981459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-service-ca\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:25.981729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.981484 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-oauth-serving-cert\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:25.981729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.981507 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vjrjq\" (UniqueName: \"kubernetes.io/projected/76b92da7-a1a7-4d46-b3af-62c10c9da34a-kube-api-access-vjrjq\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:25.981729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.981579 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-oauth-config\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:25.981729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.981641 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-serving-cert\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:25.981729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:25.981703 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-config\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.082988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.082940 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-trusted-ca-bundle\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.082988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.082991 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-service-ca\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.082988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083010 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-oauth-serving-cert\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.083279 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083029 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vjrjq\" (UniqueName: \"kubernetes.io/projected/76b92da7-a1a7-4d46-b3af-62c10c9da34a-kube-api-access-vjrjq\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.083279 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083047 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-oauth-config\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.083279 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083079 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-serving-cert\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.083279 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083108 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-config\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.083806 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083783 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-service-ca\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.083902 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083845 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-oauth-serving-cert\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.083902 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083876 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-config\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.083980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.083917 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-trusted-ca-bundle\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.086025 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.086004 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\"
(UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-oauth-config\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.086164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.086144 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-serving-cert\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.092265 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.092247 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vjrjq\" (UniqueName: \"kubernetes.io/projected/76b92da7-a1a7-4d46-b3af-62c10c9da34a-kube-api-access-vjrjq\") pod \"console-89c5f8668-4vhtf\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.175932 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.175905 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:26.316738 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.316548 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-89c5f8668-4vhtf"]
Apr 22 14:19:26.319416 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:19:26.319382 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod76b92da7_a1a7_4d46_b3af_62c10c9da34a.slice/crio-79f6bac61913ec2336327f4dd68589ae66be06378e48855f6a279e554109d903 WatchSource:0}: Error finding container 79f6bac61913ec2336327f4dd68589ae66be06378e48855f6a279e554109d903: Status 404 returned error can't find the container with id 79f6bac61913ec2336327f4dd68589ae66be06378e48855f6a279e554109d903
Apr 22 14:19:26.852635 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.852600 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89c5f8668-4vhtf" event={"ID":"76b92da7-a1a7-4d46-b3af-62c10c9da34a","Type":"ContainerStarted","Data":"ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d"}
Apr 22 14:19:26.852635 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.852634 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89c5f8668-4vhtf" event={"ID":"76b92da7-a1a7-4d46-b3af-62c10c9da34a","Type":"ContainerStarted","Data":"79f6bac61913ec2336327f4dd68589ae66be06378e48855f6a279e554109d903"}
Apr 22 14:19:26.871047 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:26.871000 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-89c5f8668-4vhtf" podStartSLOduration=1.870986468 podStartE2EDuration="1.870986468s" podCreationTimestamp="2026-04-22 14:19:25 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:19:26.869645589 +0000 UTC m=+267.544448854" watchObservedRunningTime="2026-04-22 14:19:26.870986468 +0000 UTC m=+267.545789733"
Apr 22 14:19:36.177085 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:36.177052 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:36.177085 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:36.177095 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:36.181947 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:36.181920 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:36.886041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:36.886012 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-89c5f8668-4vhtf"
Apr 22 14:19:36.939008 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:36.938970 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f6648c54c-r58c5"]
Apr 22 14:19:53.792935 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.792901 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-hwfss"]
Apr 22 14:19:53.800000 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.799976 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.807568 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.807514 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 22 14:19:53.808390 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.808364 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hwfss"] Apr 22 14:19:53.890743 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.890699 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-kubelet-config\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.890743 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.890750 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-dbus\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.890946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.890787 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.991430 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.991399 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: 
\"kubernetes.io/host-path/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-kubelet-config\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.991597 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.991437 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-dbus\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.991597 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.991519 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-kubelet-config\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.991597 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.991528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.991713 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.991651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-dbus\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:53.993782 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:53.993762 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e-original-pull-secret\") pod \"global-pull-secret-syncer-hwfss\" (UID: \"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e\") " pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:54.110119 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:54.110030 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-hwfss" Apr 22 14:19:54.231325 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:54.231283 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-hwfss"] Apr 22 14:19:54.233788 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:19:54.233759 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbd7a4001_4bdb_4908_9ffe_6cd7422ebd4e.slice/crio-1523f7ca53d9b47cba0f4db4d0b6bd3cc5d980e0351eb51ec9d8dac82dc3e863 WatchSource:0}: Error finding container 1523f7ca53d9b47cba0f4db4d0b6bd3cc5d980e0351eb51ec9d8dac82dc3e863: Status 404 returned error can't find the container with id 1523f7ca53d9b47cba0f4db4d0b6bd3cc5d980e0351eb51ec9d8dac82dc3e863 Apr 22 14:19:54.932803 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:54.932763 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hwfss" event={"ID":"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e","Type":"ContainerStarted","Data":"1523f7ca53d9b47cba0f4db4d0b6bd3cc5d980e0351eb51ec9d8dac82dc3e863"} Apr 22 14:19:58.946486 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:58.946447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-hwfss" event={"ID":"bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e","Type":"ContainerStarted","Data":"46576d080e5bbfc483e10513ab250bb1d2c88de3865cbc014bc1e4e5c168ed06"} Apr 22 14:19:58.961678 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:58.961622 2578 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-hwfss" podStartSLOduration=1.968987437 podStartE2EDuration="5.961605444s" podCreationTimestamp="2026-04-22 14:19:53 +0000 UTC" firstStartedPulling="2026-04-22 14:19:54.235418862 +0000 UTC m=+294.910222107" lastFinishedPulling="2026-04-22 14:19:58.228036857 +0000 UTC m=+298.902840114" observedRunningTime="2026-04-22 14:19:58.960745721 +0000 UTC m=+299.635548987" watchObservedRunningTime="2026-04-22 14:19:58.961605444 +0000 UTC m=+299.636408708" Apr 22 14:19:59.834113 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:59.834082 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log" Apr 22 14:19:59.834916 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:59.834894 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log" Apr 22 14:19:59.840785 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:19:59.840759 2578 kubelet.go:1628] "Image garbage collection succeeded" Apr 22 14:20:01.958880 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:01.958814 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-6f6648c54c-r58c5" podUID="e7c9cb0e-e6e5-43cd-a14e-efb10585d395" containerName="console" containerID="cri-o://ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024" gracePeriod=15 Apr 22 14:20:02.198228 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.198201 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f6648c54c-r58c5_e7c9cb0e-e6e5-43cd-a14e-efb10585d395/console/0.log" Apr 22 14:20:02.198379 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.198260 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:20:02.260684 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.260603 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-serving-cert\") pod \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " Apr 22 14:20:02.260684 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.260640 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-oauth-serving-cert\") pod \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " Apr 22 14:20:02.260684 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.260669 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-config\") pod \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " Apr 22 14:20:02.260897 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.260846 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-trusted-ca-bundle\") pod \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " Apr 22 14:20:02.260977 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.260958 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-oauth-config\") pod \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " Apr 22 14:20:02.261039 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.261005 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-service-ca\") pod \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " Apr 22 14:20:02.261039 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.261035 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cq9xx\" (UniqueName: \"kubernetes.io/projected/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-kube-api-access-cq9xx\") pod \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\" (UID: \"e7c9cb0e-e6e5-43cd-a14e-efb10585d395\") " Apr 22 14:20:02.261164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.261005 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-config" (OuterVolumeSpecName: "console-config") pod "e7c9cb0e-e6e5-43cd-a14e-efb10585d395" (UID: "e7c9cb0e-e6e5-43cd-a14e-efb10585d395"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:20:02.261164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.261035 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "e7c9cb0e-e6e5-43cd-a14e-efb10585d395" (UID: "e7c9cb0e-e6e5-43cd-a14e-efb10585d395"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:20:02.261320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.261281 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-oauth-serving-cert\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:20:02.261424 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.261323 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-config\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:20:02.261424 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.261333 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "e7c9cb0e-e6e5-43cd-a14e-efb10585d395" (UID: "e7c9cb0e-e6e5-43cd-a14e-efb10585d395"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:20:02.261506 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.261467 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-service-ca" (OuterVolumeSpecName: "service-ca") pod "e7c9cb0e-e6e5-43cd-a14e-efb10585d395" (UID: "e7c9cb0e-e6e5-43cd-a14e-efb10585d395"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:20:02.262932 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.262907 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "e7c9cb0e-e6e5-43cd-a14e-efb10585d395" (UID: "e7c9cb0e-e6e5-43cd-a14e-efb10585d395"). 
InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:20:02.263053 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.263031 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-kube-api-access-cq9xx" (OuterVolumeSpecName: "kube-api-access-cq9xx") pod "e7c9cb0e-e6e5-43cd-a14e-efb10585d395" (UID: "e7c9cb0e-e6e5-43cd-a14e-efb10585d395"). InnerVolumeSpecName "kube-api-access-cq9xx". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:20:02.263100 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.263076 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "e7c9cb0e-e6e5-43cd-a14e-efb10585d395" (UID: "e7c9cb0e-e6e5-43cd-a14e-efb10585d395"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:20:02.361885 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.361834 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-oauth-config\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:20:02.361885 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.361880 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-service-ca\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:20:02.362098 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.361895 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cq9xx\" (UniqueName: \"kubernetes.io/projected/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-kube-api-access-cq9xx\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath 
\"\"" Apr 22 14:20:02.362098 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.361918 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-console-serving-cert\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:20:02.362098 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.361927 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/e7c9cb0e-e6e5-43cd-a14e-efb10585d395-trusted-ca-bundle\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:20:02.962435 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.962406 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-6f6648c54c-r58c5_e7c9cb0e-e6e5-43cd-a14e-efb10585d395/console/0.log" Apr 22 14:20:02.962860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.962447 2578 generic.go:358] "Generic (PLEG): container finished" podID="e7c9cb0e-e6e5-43cd-a14e-efb10585d395" containerID="ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024" exitCode=2 Apr 22 14:20:02.962860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.962512 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-6f6648c54c-r58c5" Apr 22 14:20:02.962860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.962519 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f6648c54c-r58c5" event={"ID":"e7c9cb0e-e6e5-43cd-a14e-efb10585d395","Type":"ContainerDied","Data":"ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024"} Apr 22 14:20:02.962860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.962545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-6f6648c54c-r58c5" event={"ID":"e7c9cb0e-e6e5-43cd-a14e-efb10585d395","Type":"ContainerDied","Data":"6f99ff03ada5effb912d596280584a3967a4f6d1434265d6779932e0b750e829"} Apr 22 14:20:02.962860 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.962560 2578 scope.go:117] "RemoveContainer" containerID="ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024" Apr 22 14:20:02.971979 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.971961 2578 scope.go:117] "RemoveContainer" containerID="ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024" Apr 22 14:20:02.972236 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:20:02.972217 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024\": container with ID starting with ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024 not found: ID does not exist" containerID="ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024" Apr 22 14:20:02.972279 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.972247 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024"} err="failed to get container status \"ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024\": rpc error: code = 
NotFound desc = could not find container \"ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024\": container with ID starting with ce1074a18c3397d78404bd6516a8ea32c6e053b72ab7bf9e76d3f9f7bf654024 not found: ID does not exist" Apr 22 14:20:02.987176 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.987152 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-6f6648c54c-r58c5"] Apr 22 14:20:02.991157 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:02.991136 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-6f6648c54c-r58c5"] Apr 22 14:20:03.951886 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:20:03.951849 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="e7c9cb0e-e6e5-43cd-a14e-efb10585d395" path="/var/lib/kubelet/pods/e7c9cb0e-e6e5-43cd-a14e-efb10585d395/volumes" Apr 22 14:21:02.357796 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.357756 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms"] Apr 22 14:21:02.358222 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.358071 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="e7c9cb0e-e6e5-43cd-a14e-efb10585d395" containerName="console" Apr 22 14:21:02.358222 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.358082 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="e7c9cb0e-e6e5-43cd-a14e-efb10585d395" containerName="console" Apr 22 14:21:02.358222 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.358129 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="e7c9cb0e-e6e5-43cd-a14e-efb10585d395" containerName="console" Apr 22 14:21:02.362118 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.362100 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" Apr 22 14:21:02.364946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.364923 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 22 14:21:02.366044 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.366019 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 22 14:21:02.366151 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.366055 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-5pb94\"" Apr 22 14:21:02.366151 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.366060 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 22 14:21:02.372265 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.372246 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms"] Apr 22 14:21:02.435045 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.435011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c87a8920-9a6d-4291-a55e-0263bc532c41-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms\" (UID: \"c87a8920-9a6d-4291-a55e-0263bc532c41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" Apr 22 14:21:02.435208 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.435061 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tqkxv\" (UniqueName: \"kubernetes.io/projected/c87a8920-9a6d-4291-a55e-0263bc532c41-kube-api-access-tqkxv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms\" (UID: 
\"c87a8920-9a6d-4291-a55e-0263bc532c41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" Apr 22 14:21:02.535784 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.535755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c87a8920-9a6d-4291-a55e-0263bc532c41-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms\" (UID: \"c87a8920-9a6d-4291-a55e-0263bc532c41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" Apr 22 14:21:02.535879 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.535800 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tqkxv\" (UniqueName: \"kubernetes.io/projected/c87a8920-9a6d-4291-a55e-0263bc532c41-kube-api-access-tqkxv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms\" (UID: \"c87a8920-9a6d-4291-a55e-0263bc532c41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" Apr 22 14:21:02.538104 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.538067 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/c87a8920-9a6d-4291-a55e-0263bc532c41-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms\" (UID: \"c87a8920-9a6d-4291-a55e-0263bc532c41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" Apr 22 14:21:02.543931 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.543910 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tqkxv\" (UniqueName: \"kubernetes.io/projected/c87a8920-9a6d-4291-a55e-0263bc532c41-kube-api-access-tqkxv\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms\" (UID: \"c87a8920-9a6d-4291-a55e-0263bc532c41\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" Apr 22 14:21:02.673816 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:21:02.673777 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" Apr 22 14:21:02.794125 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.794100 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms"] Apr 22 14:21:02.796586 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:21:02.796559 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc87a8920_9a6d_4291_a55e_0263bc532c41.slice/crio-dbad26babf919f001a3122ee94fb4d07b14ecad1dcf124a906dab6b94f009324 WatchSource:0}: Error finding container dbad26babf919f001a3122ee94fb4d07b14ecad1dcf124a906dab6b94f009324: Status 404 returned error can't find the container with id dbad26babf919f001a3122ee94fb4d07b14ecad1dcf124a906dab6b94f009324 Apr 22 14:21:02.798144 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:02.798129 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:21:03.140291 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:03.140250 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" event={"ID":"c87a8920-9a6d-4291-a55e-0263bc532c41","Type":"ContainerStarted","Data":"dbad26babf919f001a3122ee94fb4d07b14ecad1dcf124a906dab6b94f009324"} Apr 22 14:21:07.155277 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.155239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" event={"ID":"c87a8920-9a6d-4291-a55e-0263bc532c41","Type":"ContainerStarted","Data":"6af927efb14293a4ca4433c9bd48d04339928415ff5d74c600f3e35cb248cfb1"} Apr 22 14:21:07.155693 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.155373 2578 kubelet.go:2658] "SyncLoop (probe)" 
probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms"
Apr 22 14:21:07.175897 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.175840 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms" podStartSLOduration=1.272063754 podStartE2EDuration="5.175822485s" podCreationTimestamp="2026-04-22 14:21:02 +0000 UTC" firstStartedPulling="2026-04-22 14:21:02.798254697 +0000 UTC m=+363.473057940" lastFinishedPulling="2026-04-22 14:21:06.702013415 +0000 UTC m=+367.376816671" observedRunningTime="2026-04-22 14:21:07.175042494 +0000 UTC m=+367.849845759" watchObservedRunningTime="2026-04-22 14:21:07.175822485 +0000 UTC m=+367.850625752"
Apr 22 14:21:07.238198 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.238168 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-45r4v"]
Apr 22 14:21:07.241506 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.241486 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.247716 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.247690 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-certs\""
Apr 22 14:21:07.248085 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.247729 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\""
Apr 22 14:21:07.248184 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.247756 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-pkkkc\""
Apr 22 14:21:07.252718 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.252690 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-45r4v"]
Apr 22 14:21:07.272795 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.272765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f93c5796-dc9f-4109-bf77-2658b844be9d-cabundle0\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.272946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.272806 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.272946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.272895 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f7xgv\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-kube-api-access-f7xgv\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.373799 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.373756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f93c5796-dc9f-4109-bf77-2658b844be9d-cabundle0\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.373799 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.373804 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.374012 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.373833 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f7xgv\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-kube-api-access-f7xgv\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.374012 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.373980 2578 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 22 14:21:07.374012 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.374004 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 22 14:21:07.374012 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.374011 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 14:21:07.374132 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.374041 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-45r4v: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 14:21:07.374132 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.374095 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates podName:f93c5796-dc9f-4109-bf77-2658b844be9d nodeName:}" failed. No retries permitted until 2026-04-22 14:21:07.874075556 +0000 UTC m=+368.548878799 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates") pod "keda-operator-ffbb595cb-45r4v" (UID: "f93c5796-dc9f-4109-bf77-2658b844be9d") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 14:21:07.374351 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.374333 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cabundle0\" (UniqueName: \"kubernetes.io/configmap/f93c5796-dc9f-4109-bf77-2658b844be9d-cabundle0\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.382930 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.382900 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f7xgv\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-kube-api-access-f7xgv\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.878399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:07.878364 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:07.878599 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.878505 2578 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 22 14:21:07.878599 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.878529 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 22 14:21:07.878599 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.878537 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 14:21:07.878599 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.878548 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-45r4v: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 14:21:07.878599 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:07.878599 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates podName:f93c5796-dc9f-4109-bf77-2658b844be9d nodeName:}" failed. No retries permitted until 2026-04-22 14:21:08.878584814 +0000 UTC m=+369.553388058 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates") pod "keda-operator-ffbb595cb-45r4v" (UID: "f93c5796-dc9f-4109-bf77-2658b844be9d") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 14:21:08.885882 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:08.885839 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:08.886335 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:08.885984 2578 projected.go:264] Couldn't get secret openshift-keda/keda-operator-certs: secret "keda-operator-certs" not found
Apr 22 14:21:08.886335 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:08.886009 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 22 14:21:08.886335 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:08.886016 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 14:21:08.886335 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:08.886027 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-45r4v: [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 14:21:08.886335 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:08.886088 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates podName:f93c5796-dc9f-4109-bf77-2658b844be9d nodeName:}" failed. No retries permitted until 2026-04-22 14:21:10.886071559 +0000 UTC m=+371.560874806 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates") pod "keda-operator-ffbb595cb-45r4v" (UID: "f93c5796-dc9f-4109-bf77-2658b844be9d") : [secret "keda-operator-certs" not found, references non-existent secret key: ca.crt]
Apr 22 14:21:10.903095 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:10.903056 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:10.903502 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:10.903175 2578 secret.go:281] references non-existent secret key: ca.crt
Apr 22 14:21:10.903502 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:10.903188 2578 projected.go:277] Couldn't get secret payload openshift-keda/kedaorg-certs: references non-existent secret key: ca.crt
Apr 22 14:21:10.903502 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:10.903196 2578 projected.go:194] Error preparing data for projected volume certificates for pod openshift-keda/keda-operator-ffbb595cb-45r4v: references non-existent secret key: ca.crt
Apr 22 14:21:10.903502 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:21:10.903243 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates podName:f93c5796-dc9f-4109-bf77-2658b844be9d nodeName:}" failed. No retries permitted until 2026-04-22 14:21:14.903229841 +0000 UTC m=+375.578033085 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "certificates" (UniqueName: "kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates") pod "keda-operator-ffbb595cb-45r4v" (UID: "f93c5796-dc9f-4109-bf77-2658b844be9d") : references non-existent secret key: ca.crt
Apr 22 14:21:14.935790 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:14.935738 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:14.938252 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:14.938227 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/f93c5796-dc9f-4109-bf77-2658b844be9d-certificates\") pod \"keda-operator-ffbb595cb-45r4v\" (UID: \"f93c5796-dc9f-4109-bf77-2658b844be9d\") " pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:15.051665 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:15.051611 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:15.169694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:15.169619 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-operator-ffbb595cb-45r4v"]
Apr 22 14:21:15.172312 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:21:15.172269 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf93c5796_dc9f_4109_bf77_2658b844be9d.slice/crio-c7791e2eb7f8ff530814bc61145c1d42f9319b5e90bf9c980aa7d020a6723795 WatchSource:0}: Error finding container c7791e2eb7f8ff530814bc61145c1d42f9319b5e90bf9c980aa7d020a6723795: Status 404 returned error can't find the container with id c7791e2eb7f8ff530814bc61145c1d42f9319b5e90bf9c980aa7d020a6723795
Apr 22 14:21:15.180633 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:15.180601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-45r4v" event={"ID":"f93c5796-dc9f-4109-bf77-2658b844be9d","Type":"ContainerStarted","Data":"c7791e2eb7f8ff530814bc61145c1d42f9319b5e90bf9c980aa7d020a6723795"}
Apr 22 14:21:19.193656 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:19.193620 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-operator-ffbb595cb-45r4v" event={"ID":"f93c5796-dc9f-4109-bf77-2658b844be9d","Type":"ContainerStarted","Data":"677b968aee86ccf171ccb0b414e83d5ad4183b1c1697bd8b28e5aadea8ca95fb"}
Apr 22 14:21:19.194100 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:19.193769 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:21:19.212559 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:19.212509 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-operator-ffbb595cb-45r4v" podStartSLOduration=9.209303679 podStartE2EDuration="12.212494995s" podCreationTimestamp="2026-04-22 14:21:07 +0000 UTC" firstStartedPulling="2026-04-22 14:21:15.173635958 +0000 UTC m=+375.848439201" lastFinishedPulling="2026-04-22 14:21:18.176827259 +0000 UTC m=+378.851630517" observedRunningTime="2026-04-22 14:21:19.210962191 +0000 UTC m=+379.885765456" watchObservedRunningTime="2026-04-22 14:21:19.212494995 +0000 UTC m=+379.887298259"
Apr 22 14:21:28.161557 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:28.161528 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-5g9ms"
Apr 22 14:21:40.199830 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:21:40.199798 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-operator-ffbb595cb-45r4v"
Apr 22 14:22:14.960814 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:14.960734 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"]
Apr 22 14:22:14.963940 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:14.963920 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"
Apr 22 14:22:14.967231 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:14.967200 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"kube-root-ca.crt\""
Apr 22 14:22:14.967386 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:14.967337 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager-operator\"/\"cert-manager-operator-controller-manager-dockercfg-6bnrc\""
Apr 22 14:22:14.967624 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:14.967605 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:22:14.983572 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:14.983544 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"]
Apr 22 14:22:15.014131 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:15.014094 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9vnj8\" (UniqueName: \"kubernetes.io/projected/81fab075-f57c-45d8-aa3f-503504dec6e7-kube-api-access-9vnj8\") pod \"cert-manager-operator-controller-manager-54b9655956-hqtpm\" (UID: \"81fab075-f57c-45d8-aa3f-503504dec6e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"
Apr 22 14:22:15.014376 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:15.014175 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81fab075-f57c-45d8-aa3f-503504dec6e7-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-hqtpm\" (UID: \"81fab075-f57c-45d8-aa3f-503504dec6e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"
Apr 22 14:22:15.115770 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:15.115712 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9vnj8\" (UniqueName: \"kubernetes.io/projected/81fab075-f57c-45d8-aa3f-503504dec6e7-kube-api-access-9vnj8\") pod \"cert-manager-operator-controller-manager-54b9655956-hqtpm\" (UID: \"81fab075-f57c-45d8-aa3f-503504dec6e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"
Apr 22 14:22:15.115965 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:15.115822 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81fab075-f57c-45d8-aa3f-503504dec6e7-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-hqtpm\" (UID: \"81fab075-f57c-45d8-aa3f-503504dec6e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"
Apr 22 14:22:15.116332 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:15.116286 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/81fab075-f57c-45d8-aa3f-503504dec6e7-tmp\") pod \"cert-manager-operator-controller-manager-54b9655956-hqtpm\" (UID: \"81fab075-f57c-45d8-aa3f-503504dec6e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"
Apr 22 14:22:15.125564 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:15.125536 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9vnj8\" (UniqueName: \"kubernetes.io/projected/81fab075-f57c-45d8-aa3f-503504dec6e7-kube-api-access-9vnj8\") pod \"cert-manager-operator-controller-manager-54b9655956-hqtpm\" (UID: \"81fab075-f57c-45d8-aa3f-503504dec6e7\") " pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"
Apr 22 14:22:15.272823 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:15.272737 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"
Apr 22 14:22:15.401644 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:15.401521 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm"]
Apr 22 14:22:15.404145 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:22:15.404113 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod81fab075_f57c_45d8_aa3f_503504dec6e7.slice/crio-5f18cf322becd3ce917f1d338070c3ba3ccdf761771b67a0b0b106ec75530632 WatchSource:0}: Error finding container 5f18cf322becd3ce917f1d338070c3ba3ccdf761771b67a0b0b106ec75530632: Status 404 returned error can't find the container with id 5f18cf322becd3ce917f1d338070c3ba3ccdf761771b67a0b0b106ec75530632
Apr 22 14:22:16.387384 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:16.387341 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm" event={"ID":"81fab075-f57c-45d8-aa3f-503504dec6e7","Type":"ContainerStarted","Data":"5f18cf322becd3ce917f1d338070c3ba3ccdf761771b67a0b0b106ec75530632"}
Apr 22 14:22:18.396128 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:18.396090 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm" event={"ID":"81fab075-f57c-45d8-aa3f-503504dec6e7","Type":"ContainerStarted","Data":"15647ccd662dd92f8f32a7fc849e4070b6bf3f2c7d774e58604aff4f5f978809"}
Apr 22 14:22:18.440320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:18.440243 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager-operator/cert-manager-operator-controller-manager-54b9655956-hqtpm" podStartSLOduration=1.90745959 podStartE2EDuration="4.440223102s" podCreationTimestamp="2026-04-22 14:22:14 +0000 UTC" firstStartedPulling="2026-04-22 14:22:15.406726074 +0000 UTC m=+436.081529319" lastFinishedPulling="2026-04-22 14:22:17.939489571 +0000 UTC m=+438.614292831" observedRunningTime="2026-04-22 14:22:18.438708727 +0000 UTC m=+439.113511993" watchObservedRunningTime="2026-04-22 14:22:18.440223102 +0000 UTC m=+439.115026368"
Apr 22 14:22:31.511041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.510998 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-8xvzf"]
Apr 22 14:22:31.514396 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.514377 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-8xvzf"
Apr 22 14:22:31.521050 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.521027 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\""
Apr 22 14:22:31.521168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.521060 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-wgnxl\""
Apr 22 14:22:31.522061 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.522045 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\""
Apr 22 14:22:31.529286 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.529257 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-8xvzf"]
Apr 22 14:22:31.646794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.646755 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ngrw9\" (UniqueName: \"kubernetes.io/projected/70c55459-25ae-4338-9d3d-0019ec500410-kube-api-access-ngrw9\") pod \"cert-manager-79c8d999ff-8xvzf\" (UID: \"70c55459-25ae-4338-9d3d-0019ec500410\") " pod="cert-manager/cert-manager-79c8d999ff-8xvzf"
Apr 22 14:22:31.646794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.646801 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c55459-25ae-4338-9d3d-0019ec500410-bound-sa-token\") pod \"cert-manager-79c8d999ff-8xvzf\" (UID: \"70c55459-25ae-4338-9d3d-0019ec500410\") " pod="cert-manager/cert-manager-79c8d999ff-8xvzf"
Apr 22 14:22:31.747914 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.747869 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ngrw9\" (UniqueName: \"kubernetes.io/projected/70c55459-25ae-4338-9d3d-0019ec500410-kube-api-access-ngrw9\") pod \"cert-manager-79c8d999ff-8xvzf\" (UID: \"70c55459-25ae-4338-9d3d-0019ec500410\") " pod="cert-manager/cert-manager-79c8d999ff-8xvzf"
Apr 22 14:22:31.748095 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.747928 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c55459-25ae-4338-9d3d-0019ec500410-bound-sa-token\") pod \"cert-manager-79c8d999ff-8xvzf\" (UID: \"70c55459-25ae-4338-9d3d-0019ec500410\") " pod="cert-manager/cert-manager-79c8d999ff-8xvzf"
Apr 22 14:22:31.757409 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.757371 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/70c55459-25ae-4338-9d3d-0019ec500410-bound-sa-token\") pod \"cert-manager-79c8d999ff-8xvzf\" (UID: \"70c55459-25ae-4338-9d3d-0019ec500410\") " pod="cert-manager/cert-manager-79c8d999ff-8xvzf"
Apr 22 14:22:31.757549 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.757434 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ngrw9\" (UniqueName: \"kubernetes.io/projected/70c55459-25ae-4338-9d3d-0019ec500410-kube-api-access-ngrw9\") pod \"cert-manager-79c8d999ff-8xvzf\" (UID: \"70c55459-25ae-4338-9d3d-0019ec500410\") " pod="cert-manager/cert-manager-79c8d999ff-8xvzf"
Apr 22 14:22:31.823316 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.823201 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-8xvzf"
Apr 22 14:22:31.948103 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:22:31.948055 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod70c55459_25ae_4338_9d3d_0019ec500410.slice/crio-cf8a70e7e97a941e257c0f27fa8d92b29dfe16dcbe3904a70f4abe4abb30216d WatchSource:0}: Error finding container cf8a70e7e97a941e257c0f27fa8d92b29dfe16dcbe3904a70f4abe4abb30216d: Status 404 returned error can't find the container with id cf8a70e7e97a941e257c0f27fa8d92b29dfe16dcbe3904a70f4abe4abb30216d
Apr 22 14:22:31.952529 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:31.952507 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-8xvzf"]
Apr 22 14:22:32.441537 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:32.441500 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-8xvzf" event={"ID":"70c55459-25ae-4338-9d3d-0019ec500410","Type":"ContainerStarted","Data":"cf8a70e7e97a941e257c0f27fa8d92b29dfe16dcbe3904a70f4abe4abb30216d"}
Apr 22 14:22:35.454073 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:35.454040 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-8xvzf" event={"ID":"70c55459-25ae-4338-9d3d-0019ec500410","Type":"ContainerStarted","Data":"158b3ae7273204e51323d77bba5f68403cfc76aa256de9e432ed57c9fba3860b"}
Apr 22 14:22:35.473563 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:35.473496 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-8xvzf" podStartSLOduration=1.99099051 podStartE2EDuration="4.473477542s" podCreationTimestamp="2026-04-22 14:22:31 +0000 UTC" firstStartedPulling="2026-04-22 14:22:31.949942471 +0000 UTC m=+452.624745714" lastFinishedPulling="2026-04-22 14:22:34.432429503 +0000 UTC m=+455.107232746" observedRunningTime="2026-04-22 14:22:35.470870231 +0000 UTC m=+456.145673496" watchObservedRunningTime="2026-04-22 14:22:35.473477542 +0000 UTC m=+456.148280810"
Apr 22 14:22:36.001667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.001628 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"]
Apr 22 14:22:36.004931 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.004915 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"
Apr 22 14:22:36.008680 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.008654 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\""
Apr 22 14:22:36.009718 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.009687 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\""
Apr 22 14:22:36.009718 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.009693 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"openshift-lws-operator-dockercfg-46bsh\""
Apr 22 14:22:36.017867 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.017838 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"]
Apr 22 14:22:36.085556 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.085509 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zw5lm\" (UniqueName: \"kubernetes.io/projected/9fefa1c0-f5be-444c-927a-f862a64444c6-kube-api-access-zw5lm\") pod \"openshift-lws-operator-bfc7f696d-zf482\" (UID: \"9fefa1c0-f5be-444c-927a-f862a64444c6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"
Apr 22 14:22:36.085733 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.085583 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fefa1c0-f5be-444c-927a-f862a64444c6-tmp\") pod \"openshift-lws-operator-bfc7f696d-zf482\" (UID: \"9fefa1c0-f5be-444c-927a-f862a64444c6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"
Apr 22 14:22:36.186078 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.186038 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zw5lm\" (UniqueName: \"kubernetes.io/projected/9fefa1c0-f5be-444c-927a-f862a64444c6-kube-api-access-zw5lm\") pod \"openshift-lws-operator-bfc7f696d-zf482\" (UID: \"9fefa1c0-f5be-444c-927a-f862a64444c6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"
Apr 22 14:22:36.186248 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.186114 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fefa1c0-f5be-444c-927a-f862a64444c6-tmp\") pod \"openshift-lws-operator-bfc7f696d-zf482\" (UID: \"9fefa1c0-f5be-444c-927a-f862a64444c6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"
Apr 22 14:22:36.186505 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.186487 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fefa1c0-f5be-444c-927a-f862a64444c6-tmp\") pod \"openshift-lws-operator-bfc7f696d-zf482\" (UID: \"9fefa1c0-f5be-444c-927a-f862a64444c6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"
Apr 22 14:22:36.195022 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.194994 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zw5lm\" (UniqueName: \"kubernetes.io/projected/9fefa1c0-f5be-444c-927a-f862a64444c6-kube-api-access-zw5lm\") pod \"openshift-lws-operator-bfc7f696d-zf482\" (UID: \"9fefa1c0-f5be-444c-927a-f862a64444c6\") " pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"
Apr 22 14:22:36.313938 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.313844 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"
Apr 22 14:22:36.437197 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.437170 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482"]
Apr 22 14:22:36.439769 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:22:36.439738 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod9fefa1c0_f5be_444c_927a_f862a64444c6.slice/crio-c81f5aad9f68d1b26ce3bdd310ae694f518aa2f3e0301c16644e31e08a57f670 WatchSource:0}: Error finding container c81f5aad9f68d1b26ce3bdd310ae694f518aa2f3e0301c16644e31e08a57f670: Status 404 returned error can't find the container with id c81f5aad9f68d1b26ce3bdd310ae694f518aa2f3e0301c16644e31e08a57f670
Apr 22 14:22:36.458873 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:36.458843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482" event={"ID":"9fefa1c0-f5be-444c-927a-f862a64444c6","Type":"ContainerStarted","Data":"c81f5aad9f68d1b26ce3bdd310ae694f518aa2f3e0301c16644e31e08a57f670"}
Apr 22 14:22:39.472193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:39.472158 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482" event={"ID":"9fefa1c0-f5be-444c-927a-f862a64444c6","Type":"ContainerStarted","Data":"a552c57f5a7994ea72e3cde3f20da37f5e3312a2fe2e387dcb16218e83ca9a67"}
Apr 22 14:22:39.495306 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:22:39.495221 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/openshift-lws-operator-bfc7f696d-zf482" podStartSLOduration=1.813020754 podStartE2EDuration="4.495198277s" podCreationTimestamp="2026-04-22 14:22:35 +0000 UTC" firstStartedPulling="2026-04-22 14:22:36.441428299 +0000 UTC m=+457.116231558" lastFinishedPulling="2026-04-22 14:22:39.123605836 +0000 UTC m=+459.798409081" observedRunningTime="2026-04-22 14:22:39.493506676 +0000 UTC m=+460.168309942" watchObservedRunningTime="2026-04-22 14:22:39.495198277 +0000 UTC m=+460.170001542"
Apr 22 14:23:08.030979 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.030939 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"]
Apr 22 14:23:08.038264 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.038240 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"
Apr 22 14:23:08.043836 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.043814 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\""
Apr 22 14:23:08.044001 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.043814 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-kx596\""
Apr 22 14:23:08.044001 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.043816 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\""
Apr 22 14:23:08.052519 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.052494 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\""
Apr 22 14:23:08.055977 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.055953 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"]
Apr 22 14:23:08.150919 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.150879 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-manager-config\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"
Apr 22 14:23:08.150919 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.150922 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-shtdz\" (UniqueName: \"kubernetes.io/projected/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-kube-api-access-shtdz\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"
Apr 22 14:23:08.151128 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.150959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-metrics-cert\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"
Apr 22 14:23:08.151128 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.150996 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-cert\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"
Apr 22 14:23:08.251950 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.251913 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-cert\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"
Apr 22 14:23:08.252110 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.251961 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-manager-config\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"
Apr 22 14:23:08.252110 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.252050 2578 reconciler_common.go:224]
"operationExecutor.MountVolume started for volume \"kube-api-access-shtdz\" (UniqueName: \"kubernetes.io/projected/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-kube-api-access-shtdz\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:08.252212 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.252112 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-metrics-cert\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:08.252698 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.252670 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-manager-config\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:08.254464 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.254441 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-cert\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:08.254573 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.254528 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-metrics-cert\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " 
pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:08.286607 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.286537 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-shtdz\" (UniqueName: \"kubernetes.io/projected/5dfaf3df-2a3a-41a0-abf2-cfd57af634aa-kube-api-access-shtdz\") pod \"lws-controller-manager-879f8864c-2gqzx\" (UID: \"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa\") " pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:08.347692 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.347660 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:08.506385 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.506355 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx"] Apr 22 14:23:08.507738 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:23:08.507710 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5dfaf3df_2a3a_41a0_abf2_cfd57af634aa.slice/crio-546163cb8db7369465e92064103422cb6f47f16ed9d05527dde38516bd6a875c WatchSource:0}: Error finding container 546163cb8db7369465e92064103422cb6f47f16ed9d05527dde38516bd6a875c: Status 404 returned error can't find the container with id 546163cb8db7369465e92064103422cb6f47f16ed9d05527dde38516bd6a875c Apr 22 14:23:08.568591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:08.568509 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" event={"ID":"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa","Type":"ContainerStarted","Data":"546163cb8db7369465e92064103422cb6f47f16ed9d05527dde38516bd6a875c"} Apr 22 14:23:10.575907 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:10.575867 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" event={"ID":"5dfaf3df-2a3a-41a0-abf2-cfd57af634aa","Type":"ContainerStarted","Data":"1ab100484774f83a1c7b35edff127962954b9bc7e8fc4eff07cc3e9dfdf1b60b"} Apr 22 14:23:10.576390 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:10.576117 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:21.581248 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:21.581218 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" Apr 22 14:23:21.600377 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:21.600316 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-lws-operator/lws-controller-manager-879f8864c-2gqzx" podStartSLOduration=13.029685134 podStartE2EDuration="14.600284951s" podCreationTimestamp="2026-04-22 14:23:07 +0000 UTC" firstStartedPulling="2026-04-22 14:23:08.509591023 +0000 UTC m=+489.184394266" lastFinishedPulling="2026-04-22 14:23:10.08019084 +0000 UTC m=+490.754994083" observedRunningTime="2026-04-22 14:23:10.606922862 +0000 UTC m=+491.281726126" watchObservedRunningTime="2026-04-22 14:23:21.600284951 +0000 UTC m=+502.275088216" Apr 22 14:23:30.703289 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:30.703252 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-89c5f8668-4vhtf"] Apr 22 14:23:55.722409 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.722319 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-89c5f8668-4vhtf" podUID="76b92da7-a1a7-4d46-b3af-62c10c9da34a" containerName="console" containerID="cri-o://ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d" gracePeriod=15 Apr 22 14:23:55.956229 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.956204 2578 log.go:25] 
"Finished parsing log file" path="/var/log/pods/openshift-console_console-89c5f8668-4vhtf_76b92da7-a1a7-4d46-b3af-62c10c9da34a/console/0.log" Apr 22 14:23:55.956369 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.956262 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-89c5f8668-4vhtf" Apr 22 14:23:55.965070 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965047 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-vjrjq\" (UniqueName: \"kubernetes.io/projected/76b92da7-a1a7-4d46-b3af-62c10c9da34a-kube-api-access-vjrjq\") pod \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " Apr 22 14:23:55.965172 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965082 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-oauth-config\") pod \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " Apr 22 14:23:55.965172 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965103 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-service-ca\") pod \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " Apr 22 14:23:55.965172 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965131 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-oauth-serving-cert\") pod \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " Apr 22 14:23:55.965172 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965151 2578 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-serving-cert\") pod \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " Apr 22 14:23:55.965399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965179 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-trusted-ca-bundle\") pod \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " Apr 22 14:23:55.965399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965200 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-config\") pod \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\" (UID: \"76b92da7-a1a7-4d46-b3af-62c10c9da34a\") " Apr 22 14:23:55.965672 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965640 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-service-ca" (OuterVolumeSpecName: "service-ca") pod "76b92da7-a1a7-4d46-b3af-62c10c9da34a" (UID: "76b92da7-a1a7-4d46-b3af-62c10c9da34a"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:23:55.965672 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965655 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "76b92da7-a1a7-4d46-b3af-62c10c9da34a" (UID: "76b92da7-a1a7-4d46-b3af-62c10c9da34a"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:23:55.965811 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965661 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-config" (OuterVolumeSpecName: "console-config") pod "76b92da7-a1a7-4d46-b3af-62c10c9da34a" (UID: "76b92da7-a1a7-4d46-b3af-62c10c9da34a"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:23:55.965811 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.965735 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "76b92da7-a1a7-4d46-b3af-62c10c9da34a" (UID: "76b92da7-a1a7-4d46-b3af-62c10c9da34a"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 22 14:23:55.967416 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.967390 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "76b92da7-a1a7-4d46-b3af-62c10c9da34a" (UID: "76b92da7-a1a7-4d46-b3af-62c10c9da34a"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:23:55.967502 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.967419 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/76b92da7-a1a7-4d46-b3af-62c10c9da34a-kube-api-access-vjrjq" (OuterVolumeSpecName: "kube-api-access-vjrjq") pod "76b92da7-a1a7-4d46-b3af-62c10c9da34a" (UID: "76b92da7-a1a7-4d46-b3af-62c10c9da34a"). InnerVolumeSpecName "kube-api-access-vjrjq". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:23:55.967502 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:55.967430 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "76b92da7-a1a7-4d46-b3af-62c10c9da34a" (UID: "76b92da7-a1a7-4d46-b3af-62c10c9da34a"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:23:56.066359 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.066272 2578 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-oauth-serving-cert\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:23:56.066359 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.066325 2578 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-serving-cert\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:23:56.066359 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.066336 2578 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-trusted-ca-bundle\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:23:56.066359 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.066345 2578 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-config\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:23:56.066359 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.066354 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-vjrjq\" (UniqueName: 
\"kubernetes.io/projected/76b92da7-a1a7-4d46-b3af-62c10c9da34a-kube-api-access-vjrjq\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:23:56.066359 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.066363 2578 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/76b92da7-a1a7-4d46-b3af-62c10c9da34a-console-oauth-config\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:23:56.066655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.066372 2578 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/76b92da7-a1a7-4d46-b3af-62c10c9da34a-service-ca\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:23:56.734428 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.734397 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-89c5f8668-4vhtf_76b92da7-a1a7-4d46-b3af-62c10c9da34a/console/0.log" Apr 22 14:23:56.734903 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.734442 2578 generic.go:358] "Generic (PLEG): container finished" podID="76b92da7-a1a7-4d46-b3af-62c10c9da34a" containerID="ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d" exitCode=2 Apr 22 14:23:56.734903 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.734535 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89c5f8668-4vhtf" event={"ID":"76b92da7-a1a7-4d46-b3af-62c10c9da34a","Type":"ContainerDied","Data":"ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d"} Apr 22 14:23:56.734903 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.734549 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-89c5f8668-4vhtf" Apr 22 14:23:56.734903 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.734577 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-89c5f8668-4vhtf" event={"ID":"76b92da7-a1a7-4d46-b3af-62c10c9da34a","Type":"ContainerDied","Data":"79f6bac61913ec2336327f4dd68589ae66be06378e48855f6a279e554109d903"} Apr 22 14:23:56.734903 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.734595 2578 scope.go:117] "RemoveContainer" containerID="ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d" Apr 22 14:23:56.744834 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.744584 2578 scope.go:117] "RemoveContainer" containerID="ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d" Apr 22 14:23:56.745003 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:23:56.744960 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d\": container with ID starting with ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d not found: ID does not exist" containerID="ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d" Apr 22 14:23:56.745077 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.744998 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d"} err="failed to get container status \"ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d\": rpc error: code = NotFound desc = could not find container \"ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d\": container with ID starting with ffdf39391dddccbed4716b5662b2a0a54ced7e01fadf302e1628b3024db0a91d not found: ID does not exist" Apr 22 14:23:56.759760 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.759728 2578 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-89c5f8668-4vhtf"] Apr 22 14:23:56.763847 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:56.763816 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-89c5f8668-4vhtf"] Apr 22 14:23:57.952222 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:23:57.952186 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="76b92da7-a1a7-4d46-b3af-62c10c9da34a" path="/var/lib/kubelet/pods/76b92da7-a1a7-4d46-b3af-62c10c9da34a/volumes" Apr 22 14:24:03.090526 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.090488 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt"] Apr 22 14:24:03.090989 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.090825 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="76b92da7-a1a7-4d46-b3af-62c10c9da34a" containerName="console" Apr 22 14:24:03.090989 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.090836 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="76b92da7-a1a7-4d46-b3af-62c10c9da34a" containerName="console" Apr 22 14:24:03.090989 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.090895 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="76b92da7-a1a7-4d46-b3af-62c10c9da34a" containerName="console" Apr 22 14:24:03.093806 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.093786 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.096899 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.096870 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 22 14:24:03.097021 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.096936 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 22 14:24:03.097947 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.097930 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 22 14:24:03.098016 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.097930 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 22 14:24:03.098563 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.098538 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-pkxlf\"" Apr 22 14:24:03.110884 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.110855 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c959z\" (UniqueName: \"kubernetes.io/projected/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-kube-api-access-c959z\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: \"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.111029 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.110899 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: 
\"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.111029 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.110950 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: \"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.113310 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.113274 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt"] Apr 22 14:24:03.211426 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.211375 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c959z\" (UniqueName: \"kubernetes.io/projected/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-kube-api-access-c959z\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: \"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.211426 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.211431 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: \"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.211677 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.211479 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: 
\"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.211677 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:24:03.211596 2578 secret.go:189] Couldn't get secret kuadrant-system/plugin-serving-cert: secret "plugin-serving-cert" not found Apr 22 14:24:03.211762 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:24:03.211685 2578 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-plugin-serving-cert podName:a69c2dcb-23d5-4a8f-a772-1146cb65f5be nodeName:}" failed. No retries permitted until 2026-04-22 14:24:03.711661969 +0000 UTC m=+544.386465218 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "plugin-serving-cert" (UniqueName: "kubernetes.io/secret/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-plugin-serving-cert") pod "kuadrant-console-plugin-6c886788f8-zmhlt" (UID: "a69c2dcb-23d5-4a8f-a772-1146cb65f5be") : secret "plugin-serving-cert" not found Apr 22 14:24:03.212177 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.212158 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: \"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.231087 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.231060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c959z\" (UniqueName: \"kubernetes.io/projected/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-kube-api-access-c959z\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: \"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" Apr 22 14:24:03.715460 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.715421 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: \"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt"
Apr 22 14:24:03.717952 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:03.717923 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a69c2dcb-23d5-4a8f-a772-1146cb65f5be-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-zmhlt\" (UID: \"a69c2dcb-23d5-4a8f-a772-1146cb65f5be\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt"
Apr 22 14:24:04.004099 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:04.004012 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt"
Apr 22 14:24:04.156546 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:04.156510 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt"]
Apr 22 14:24:04.159620 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:24:04.159585 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda69c2dcb_23d5_4a8f_a772_1146cb65f5be.slice/crio-7f5e3d9cfdaec3f87cfa9387a1500aa2cb1f0741e8394c5f71b444c0ca92fa5d WatchSource:0}: Error finding container 7f5e3d9cfdaec3f87cfa9387a1500aa2cb1f0741e8394c5f71b444c0ca92fa5d: Status 404 returned error can't find the container with id 7f5e3d9cfdaec3f87cfa9387a1500aa2cb1f0741e8394c5f71b444c0ca92fa5d
Apr 22 14:24:04.762121 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:04.762081 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" event={"ID":"a69c2dcb-23d5-4a8f-a772-1146cb65f5be","Type":"ContainerStarted","Data":"7f5e3d9cfdaec3f87cfa9387a1500aa2cb1f0741e8394c5f71b444c0ca92fa5d"}
Apr 22 14:24:09.783645 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:09.783607 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" event={"ID":"a69c2dcb-23d5-4a8f-a772-1146cb65f5be","Type":"ContainerStarted","Data":"25f7062c952952bd91dcaa57b1ed725e20a211cb2989bfec449edc67b57efcfd"}
Apr 22 14:24:09.804955 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:09.804899 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-zmhlt" podStartSLOduration=2.081885834 podStartE2EDuration="6.804882174s" podCreationTimestamp="2026-04-22 14:24:03 +0000 UTC" firstStartedPulling="2026-04-22 14:24:04.161170519 +0000 UTC m=+544.835973767" lastFinishedPulling="2026-04-22 14:24:08.884166864 +0000 UTC m=+549.558970107" observedRunningTime="2026-04-22 14:24:09.803113585 +0000 UTC m=+550.477916851" watchObservedRunningTime="2026-04-22 14:24:09.804882174 +0000 UTC m=+550.479685439"
Apr 22 14:24:46.358746 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.358712 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-qzb68"]
Apr 22 14:24:46.380409 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.380370 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qzb68"]
Apr 22 14:24:46.380565 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.380520 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qzb68"
Apr 22 14:24:46.383499 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.383465 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-nnl7k\""
Apr 22 14:24:46.464535 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.464498 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ndjgk\" (UniqueName: \"kubernetes.io/projected/13aa557c-c061-4eb9-a2aa-b6a19ed13c1e-kube-api-access-ndjgk\") pod \"authorino-674b59b84c-qzb68\" (UID: \"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e\") " pod="kuadrant-system/authorino-674b59b84c-qzb68"
Apr 22 14:24:46.565692 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.565652 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ndjgk\" (UniqueName: \"kubernetes.io/projected/13aa557c-c061-4eb9-a2aa-b6a19ed13c1e-kube-api-access-ndjgk\") pod \"authorino-674b59b84c-qzb68\" (UID: \"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e\") " pod="kuadrant-system/authorino-674b59b84c-qzb68"
Apr 22 14:24:46.574955 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.574922 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ndjgk\" (UniqueName: \"kubernetes.io/projected/13aa557c-c061-4eb9-a2aa-b6a19ed13c1e-kube-api-access-ndjgk\") pod \"authorino-674b59b84c-qzb68\" (UID: \"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e\") " pod="kuadrant-system/authorino-674b59b84c-qzb68"
Apr 22 14:24:46.621408 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.621335 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-hnk7k"]
Apr 22 14:24:46.624730 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.624714 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-hnk7k"
Apr 22 14:24:46.639488 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.639461 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-hnk7k"]
Apr 22 14:24:46.689968 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.689933 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qzb68"
Apr 22 14:24:46.766977 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.766944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v6ht2\" (UniqueName: \"kubernetes.io/projected/2bd229f4-9b03-40aa-bc12-f8b076c51400-kube-api-access-v6ht2\") pod \"authorino-79cbc94b89-hnk7k\" (UID: \"2bd229f4-9b03-40aa-bc12-f8b076c51400\") " pod="kuadrant-system/authorino-79cbc94b89-hnk7k"
Apr 22 14:24:46.818636 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.818606 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qzb68"]
Apr 22 14:24:46.823030 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:24:46.822956 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod13aa557c_c061_4eb9_a2aa_b6a19ed13c1e.slice/crio-28dfe24100643d3f69c6c10cffecd1be8933f4086d84422d1207e33f0bee6315 WatchSource:0}: Error finding container 28dfe24100643d3f69c6c10cffecd1be8933f4086d84422d1207e33f0bee6315: Status 404 returned error can't find the container with id 28dfe24100643d3f69c6c10cffecd1be8933f4086d84422d1207e33f0bee6315
Apr 22 14:24:46.868246 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.868209 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v6ht2\" (UniqueName: \"kubernetes.io/projected/2bd229f4-9b03-40aa-bc12-f8b076c51400-kube-api-access-v6ht2\") pod \"authorino-79cbc94b89-hnk7k\" (UID: \"2bd229f4-9b03-40aa-bc12-f8b076c51400\") " pod="kuadrant-system/authorino-79cbc94b89-hnk7k"
Apr 22 14:24:46.881169 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.881123 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v6ht2\" (UniqueName: \"kubernetes.io/projected/2bd229f4-9b03-40aa-bc12-f8b076c51400-kube-api-access-v6ht2\") pod \"authorino-79cbc94b89-hnk7k\" (UID: \"2bd229f4-9b03-40aa-bc12-f8b076c51400\") " pod="kuadrant-system/authorino-79cbc94b89-hnk7k"
Apr 22 14:24:46.906133 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.906097 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qzb68" event={"ID":"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e","Type":"ContainerStarted","Data":"28dfe24100643d3f69c6c10cffecd1be8933f4086d84422d1207e33f0bee6315"}
Apr 22 14:24:46.934401 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:46.934369 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-hnk7k"
Apr 22 14:24:47.062221 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:47.060479 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-hnk7k"]
Apr 22 14:24:47.910675 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:47.910637 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-hnk7k" event={"ID":"2bd229f4-9b03-40aa-bc12-f8b076c51400","Type":"ContainerStarted","Data":"639607b7d276e8cb9d8ff5688fb7d5a04360856703dba2f087b594e8743e5962"}
Apr 22 14:24:49.920469 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:49.920416 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-hnk7k" event={"ID":"2bd229f4-9b03-40aa-bc12-f8b076c51400","Type":"ContainerStarted","Data":"d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec"}
Apr 22 14:24:49.938781 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:49.938724 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-hnk7k" podStartSLOduration=1.208866399 podStartE2EDuration="3.938707822s" podCreationTimestamp="2026-04-22 14:24:46 +0000 UTC" firstStartedPulling="2026-04-22 14:24:47.066227293 +0000 UTC m=+587.741030537" lastFinishedPulling="2026-04-22 14:24:49.796068714 +0000 UTC m=+590.470871960" observedRunningTime="2026-04-22 14:24:49.937776528 +0000 UTC m=+590.612579794" watchObservedRunningTime="2026-04-22 14:24:49.938707822 +0000 UTC m=+590.613511078"
Apr 22 14:24:49.980983 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:49.980941 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qzb68"]
Apr 22 14:24:51.929112 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:51.929072 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qzb68" event={"ID":"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e","Type":"ContainerStarted","Data":"5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0"}
Apr 22 14:24:51.929112 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:51.929097 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-qzb68" podUID="13aa557c-c061-4eb9-a2aa-b6a19ed13c1e" containerName="authorino" containerID="cri-o://5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0" gracePeriod=30
Apr 22 14:24:51.952778 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:51.948774 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-qzb68" podStartSLOduration=1.708317321 podStartE2EDuration="5.948751995s" podCreationTimestamp="2026-04-22 14:24:46 +0000 UTC" firstStartedPulling="2026-04-22 14:24:46.824424974 +0000 UTC m=+587.499228220" lastFinishedPulling="2026-04-22 14:24:51.064859649 +0000 UTC m=+591.739662894" observedRunningTime="2026-04-22 14:24:51.946073485 +0000 UTC m=+592.620876750" watchObservedRunningTime="2026-04-22 14:24:51.948751995 +0000 UTC m=+592.623555260"
Apr 22 14:24:52.171729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.171702 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qzb68"
Apr 22 14:24:52.317065 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.316973 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-ndjgk\" (UniqueName: \"kubernetes.io/projected/13aa557c-c061-4eb9-a2aa-b6a19ed13c1e-kube-api-access-ndjgk\") pod \"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e\" (UID: \"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e\") "
Apr 22 14:24:52.319213 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.319184 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/13aa557c-c061-4eb9-a2aa-b6a19ed13c1e-kube-api-access-ndjgk" (OuterVolumeSpecName: "kube-api-access-ndjgk") pod "13aa557c-c061-4eb9-a2aa-b6a19ed13c1e" (UID: "13aa557c-c061-4eb9-a2aa-b6a19ed13c1e"). InnerVolumeSpecName "kube-api-access-ndjgk". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:24:52.417515 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.417478 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-ndjgk\" (UniqueName: \"kubernetes.io/projected/13aa557c-c061-4eb9-a2aa-b6a19ed13c1e-kube-api-access-ndjgk\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:24:52.933737 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.933703 2578 generic.go:358] "Generic (PLEG): container finished" podID="13aa557c-c061-4eb9-a2aa-b6a19ed13c1e" containerID="5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0" exitCode=0
Apr 22 14:24:52.934149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.933752 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-qzb68"
Apr 22 14:24:52.934149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.933789 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qzb68" event={"ID":"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e","Type":"ContainerDied","Data":"5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0"}
Apr 22 14:24:52.934149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.933825 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-qzb68" event={"ID":"13aa557c-c061-4eb9-a2aa-b6a19ed13c1e","Type":"ContainerDied","Data":"28dfe24100643d3f69c6c10cffecd1be8933f4086d84422d1207e33f0bee6315"}
Apr 22 14:24:52.934149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.933841 2578 scope.go:117] "RemoveContainer" containerID="5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0"
Apr 22 14:24:52.941920 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.941903 2578 scope.go:117] "RemoveContainer" containerID="5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0"
Apr 22 14:24:52.942174 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:24:52.942157 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0\": container with ID starting with 5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0 not found: ID does not exist" containerID="5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0"
Apr 22 14:24:52.942242 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.942182 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0"} err="failed to get container status \"5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0\": rpc error: code = NotFound desc = could not find container \"5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0\": container with ID starting with 5bc572fb71b878c0a5e688c5838cd989bfa08d3fe98e793056f1a98df14e3bd0 not found: ID does not exist"
Apr 22 14:24:52.957937 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.957906 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qzb68"]
Apr 22 14:24:52.959914 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:52.959887 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-qzb68"]
Apr 22 14:24:53.951598 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:53.951566 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="13aa557c-c061-4eb9-a2aa-b6a19ed13c1e" path="/var/lib/kubelet/pods/13aa557c-c061-4eb9-a2aa-b6a19ed13c1e/volumes"
Apr 22 14:24:59.858024 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:59.857994 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:24:59.858817 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:24:59.858798 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:25:11.558479 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.558396 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-jr972"]
Apr 22 14:25:11.560923 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.558755 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="13aa557c-c061-4eb9-a2aa-b6a19ed13c1e" containerName="authorino"
Apr 22 14:25:11.560923 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.558767 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="13aa557c-c061-4eb9-a2aa-b6a19ed13c1e" containerName="authorino"
Apr 22 14:25:11.560923 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.558828 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="13aa557c-c061-4eb9-a2aa-b6a19ed13c1e" containerName="authorino"
Apr 22 14:25:11.561778 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.561763 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-jr972"
Apr 22 14:25:11.565075 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.565054 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\""
Apr 22 14:25:11.573544 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.573518 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-jr972"]
Apr 22 14:25:11.578824 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.578802 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zzkxn\" (UniqueName: \"kubernetes.io/projected/cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9-kube-api-access-zzkxn\") pod \"authorino-68bd676465-jr972\" (UID: \"cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9\") " pod="kuadrant-system/authorino-68bd676465-jr972"
Apr 22 14:25:11.578959 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.578838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9-tls-cert\") pod \"authorino-68bd676465-jr972\" (UID: \"cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9\") " pod="kuadrant-system/authorino-68bd676465-jr972"
Apr 22 14:25:11.679281 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.679242 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zzkxn\" (UniqueName: \"kubernetes.io/projected/cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9-kube-api-access-zzkxn\") pod \"authorino-68bd676465-jr972\" (UID: \"cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9\") " pod="kuadrant-system/authorino-68bd676465-jr972"
Apr 22 14:25:11.679490 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.679321 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9-tls-cert\") pod \"authorino-68bd676465-jr972\" (UID: \"cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9\") " pod="kuadrant-system/authorino-68bd676465-jr972"
Apr 22 14:25:11.681940 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.681909 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9-tls-cert\") pod \"authorino-68bd676465-jr972\" (UID: \"cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9\") " pod="kuadrant-system/authorino-68bd676465-jr972"
Apr 22 14:25:11.689772 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.689749 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zzkxn\" (UniqueName: \"kubernetes.io/projected/cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9-kube-api-access-zzkxn\") pod \"authorino-68bd676465-jr972\" (UID: \"cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9\") " pod="kuadrant-system/authorino-68bd676465-jr972"
Apr 22 14:25:11.870569 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.870465 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-68bd676465-jr972"
Apr 22 14:25:11.996820 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:11.996796 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-jr972"]
Apr 22 14:25:11.999369 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:25:11.999341 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podcc092e41_b40d_4bb3_bdcf_d2bc2a3cc3c9.slice/crio-46bcdf9f7324632da59ab32b22ad1621b4b41d3a15b4771d576b886d16c0a3fc WatchSource:0}: Error finding container 46bcdf9f7324632da59ab32b22ad1621b4b41d3a15b4771d576b886d16c0a3fc: Status 404 returned error can't find the container with id 46bcdf9f7324632da59ab32b22ad1621b4b41d3a15b4771d576b886d16c0a3fc
Apr 22 14:25:13.002342 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.002291 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-jr972" event={"ID":"cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9","Type":"ContainerStarted","Data":"9a625c458a96a3c4f6e0ab72515233b520a917541c6c762c22dc531e8111ef4d"}
Apr 22 14:25:13.002342 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.002343 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-jr972" event={"ID":"cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9","Type":"ContainerStarted","Data":"46bcdf9f7324632da59ab32b22ad1621b4b41d3a15b4771d576b886d16c0a3fc"}
Apr 22 14:25:13.020474 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.020423 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-jr972" podStartSLOduration=1.57629496 podStartE2EDuration="2.020409329s" podCreationTimestamp="2026-04-22 14:25:11 +0000 UTC" firstStartedPulling="2026-04-22 14:25:12.000514769 +0000 UTC m=+612.675318024" lastFinishedPulling="2026-04-22 14:25:12.444629137 +0000 UTC m=+613.119432393" observedRunningTime="2026-04-22 14:25:13.019715325 +0000 UTC m=+613.694518590" watchObservedRunningTime="2026-04-22 14:25:13.020409329 +0000 UTC m=+613.695212594"
Apr 22 14:25:13.052402 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.052363 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-hnk7k"]
Apr 22 14:25:13.052681 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.052623 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-hnk7k" podUID="2bd229f4-9b03-40aa-bc12-f8b076c51400" containerName="authorino" containerID="cri-o://d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec" gracePeriod=30
Apr 22 14:25:13.300626 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.300603 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-hnk7k"
Apr 22 14:25:13.395485 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.395445 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v6ht2\" (UniqueName: \"kubernetes.io/projected/2bd229f4-9b03-40aa-bc12-f8b076c51400-kube-api-access-v6ht2\") pod \"2bd229f4-9b03-40aa-bc12-f8b076c51400\" (UID: \"2bd229f4-9b03-40aa-bc12-f8b076c51400\") "
Apr 22 14:25:13.397544 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.397515 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2bd229f4-9b03-40aa-bc12-f8b076c51400-kube-api-access-v6ht2" (OuterVolumeSpecName: "kube-api-access-v6ht2") pod "2bd229f4-9b03-40aa-bc12-f8b076c51400" (UID: "2bd229f4-9b03-40aa-bc12-f8b076c51400"). InnerVolumeSpecName "kube-api-access-v6ht2". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:25:13.496576 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:13.496541 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v6ht2\" (UniqueName: \"kubernetes.io/projected/2bd229f4-9b03-40aa-bc12-f8b076c51400-kube-api-access-v6ht2\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:25:14.006599 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.006558 2578 generic.go:358] "Generic (PLEG): container finished" podID="2bd229f4-9b03-40aa-bc12-f8b076c51400" containerID="d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec" exitCode=0
Apr 22 14:25:14.006599 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.006596 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-hnk7k" event={"ID":"2bd229f4-9b03-40aa-bc12-f8b076c51400","Type":"ContainerDied","Data":"d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec"}
Apr 22 14:25:14.007101 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.006616 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-hnk7k"
Apr 22 14:25:14.007101 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.006636 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-hnk7k" event={"ID":"2bd229f4-9b03-40aa-bc12-f8b076c51400","Type":"ContainerDied","Data":"639607b7d276e8cb9d8ff5688fb7d5a04360856703dba2f087b594e8743e5962"}
Apr 22 14:25:14.007101 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.006659 2578 scope.go:117] "RemoveContainer" containerID="d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec"
Apr 22 14:25:14.014755 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.014733 2578 scope.go:117] "RemoveContainer" containerID="d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec"
Apr 22 14:25:14.015017 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:25:14.014996 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec\": container with ID starting with d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec not found: ID does not exist" containerID="d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec"
Apr 22 14:25:14.015066 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.015027 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec"} err="failed to get container status \"d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec\": rpc error: code = NotFound desc = could not find container \"d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec\": container with ID starting with d305820b4b533538084211e83a98695c990c5eb4094b7f7f931300b7e02c4dec not found: ID does not exist"
Apr 22 14:25:14.026695 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.026666 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-hnk7k"]
Apr 22 14:25:14.029936 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:14.029912 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-hnk7k"]
Apr 22 14:25:15.951914 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:25:15.951881 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2bd229f4-9b03-40aa-bc12-f8b076c51400" path="/var/lib/kubelet/pods/2bd229f4-9b03-40aa-bc12-f8b076c51400/volumes"
Apr 22 14:27:35.795918 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.795881 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"]
Apr 22 14:27:35.796503 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.796365 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2bd229f4-9b03-40aa-bc12-f8b076c51400" containerName="authorino"
Apr 22 14:27:35.796503 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.796385 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="2bd229f4-9b03-40aa-bc12-f8b076c51400" containerName="authorino"
Apr 22 14:27:35.796503 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.796469 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="2bd229f4-9b03-40aa-bc12-f8b076c51400" containerName="authorino"
Apr 22 14:27:35.799473 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.799452 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.802907 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:27:35.802882 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"default-dockercfg-q979b\" is forbidden: User \"system:node:ip-10-0-132-130.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-132-130.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q979b\"" type="*v1.Secret"
Apr 22 14:27:35.803006 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:27:35.802949 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"kube-root-ca.crt\" is forbidden: User \"system:node:ip-10-0-132-130.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-132-130.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" type="*v1.ConfigMap"
Apr 22 14:27:35.803531 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:27:35.803511 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"scheduler-configmap-ref-test-kserve-self-signed-certs\" is forbidden: User \"system:node:ip-10-0-132-130.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-132-130.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" type="*v1.Secret"
Apr 22 14:27:35.803584 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.803556 2578 status_manager.go:895] "Failed to get status for pod" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" err="pods \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" is forbidden: User \"system:node:ip-10-0-132-130.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-132-130.ec2.internal' and this object"
Apr 22 14:27:35.803993 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:27:35.803975 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.Secret: secrets \"scheduler-configmap-ref-test-epp-sa-dockercfg-hx585\" is forbidden: User \"system:node:ip-10-0-132-130.ec2.internal\" cannot list resource \"secrets\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-132-130.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-hx585\"" type="*v1.Secret"
Apr 22 14:27:35.804600 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:27:35.804578 2578 reflector.go:200] "Failed to watch" err="failed to list *v1.ConfigMap: configmaps \"openshift-service-ca.crt\" is forbidden: User \"system:node:ip-10-0-132-130.ec2.internal\" cannot list resource \"configmaps\" in API group \"\" in the namespace \"kserve-ci-e2e-test\": no relationship found between node 'ip-10-0-132-130.ec2.internal' and this object" logger="UnhandledError" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" type="*v1.ConfigMap"
Apr 22 14:27:35.819391 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.819365 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"]
Apr 22 14:27:35.860161 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.860126 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.860161 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.860163 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rcx6z\" (UniqueName: \"kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.860413 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.860185 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.860413 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.860206 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.860413 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.860272 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.860413 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.860345 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.961052 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961012 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.961257 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcx6z\" (UniqueName: \"kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.961257 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961101 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.961257 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961140 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.961257 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961187 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.961257 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"
Apr 22 14:27:35.961654 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961628 2578 operation_generator.go:615] "MountVolume.SetUp succeeded 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kserve-provision-location\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:27:35.961769 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961691 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-uds\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:27:35.961769 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-tmp\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:27:35.961769 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:35.961747 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-cache\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:27:36.786001 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:36.785963 2578 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-kserve-self-signed-certs\"" Apr 22 14:27:36.794283 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:36.794246 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tls-certs\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:27:36.968649 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:27:36.968614 2578 projected.go:289] Couldn't get configMap kserve-ci-e2e-test/kube-root-ca.crt: failed to sync configmap cache: timed out waiting for the condition Apr 22 14:27:37.055835 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:37.055746 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-configmap-ref-test-epp-sa-dockercfg-hx585\"" Apr 22 14:27:37.146046 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:37.146009 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q979b\"" Apr 22 14:27:37.264415 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:37.264378 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 14:27:37.269705 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:27:37.269689 2578 projected.go:194] Error preparing data for projected volume kube-api-access-rcx6z for pod kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h: failed to sync configmap cache: timed out waiting for the condition Apr 22 14:27:37.269784 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:27:37.269774 2578 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z podName:8a14fac9-cee9-4d51-a8c6-a43e78a251b3 nodeName:}" failed. No retries permitted until 2026-04-22 14:27:37.769749005 +0000 UTC m=+758.444552253 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-rcx6z" (UniqueName: "kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z") pod "scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" (UID: "8a14fac9-cee9-4d51-a8c6-a43e78a251b3") : failed to sync configmap cache: timed out waiting for the condition Apr 22 14:27:37.405735 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:37.405700 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 14:27:37.777324 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:37.777245 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rcx6z\" (UniqueName: \"kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:27:37.779805 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:37.779771 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rcx6z\" (UniqueName: \"kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z\") pod \"scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:27:37.908098 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:37.908059 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:27:38.035435 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:38.035396 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"] Apr 22 14:27:38.038319 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:27:38.038274 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8a14fac9_cee9_4d51_a8c6_a43e78a251b3.slice/crio-78ffd215ec594a5b198f4d63ca2831dd0bf34de142a96993600959e49495de1e WatchSource:0}: Error finding container 78ffd215ec594a5b198f4d63ca2831dd0bf34de142a96993600959e49495de1e: Status 404 returned error can't find the container with id 78ffd215ec594a5b198f4d63ca2831dd0bf34de142a96993600959e49495de1e Apr 22 14:27:38.040192 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:38.040162 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:27:38.481481 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:38.481447 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" event={"ID":"8a14fac9-cee9-4d51-a8c6-a43e78a251b3","Type":"ContainerStarted","Data":"78ffd215ec594a5b198f4d63ca2831dd0bf34de142a96993600959e49495de1e"} Apr 22 14:27:41.494038 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:41.493997 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" event={"ID":"8a14fac9-cee9-4d51-a8c6-a43e78a251b3","Type":"ContainerStarted","Data":"f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc"} Apr 22 14:27:42.499760 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:42.499665 2578 generic.go:358] "Generic (PLEG): container finished" 
podID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerID="f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc" exitCode=0 Apr 22 14:27:42.500119 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:42.499757 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" event={"ID":"8a14fac9-cee9-4d51-a8c6-a43e78a251b3","Type":"ContainerDied","Data":"f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc"} Apr 22 14:27:44.509904 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:27:44.509862 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" event={"ID":"8a14fac9-cee9-4d51-a8c6-a43e78a251b3","Type":"ContainerStarted","Data":"b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0"} Apr 22 14:28:14.625063 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:14.625026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" event={"ID":"8a14fac9-cee9-4d51-a8c6-a43e78a251b3","Type":"ContainerStarted","Data":"3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26"} Apr 22 14:28:14.625546 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:14.625253 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:28:14.627886 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:14.627865 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:28:14.650280 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:14.650195 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" podStartSLOduration=3.891505147 podStartE2EDuration="39.650177922s" podCreationTimestamp="2026-04-22 14:27:35 +0000 UTC" firstStartedPulling="2026-04-22 14:27:38.040366469 +0000 UTC m=+758.715169713" lastFinishedPulling="2026-04-22 14:28:13.799039236 +0000 UTC m=+794.473842488" observedRunningTime="2026-04-22 14:28:14.648379366 +0000 UTC m=+795.323182631" watchObservedRunningTime="2026-04-22 14:28:14.650177922 +0000 UTC m=+795.324981187" Apr 22 14:28:17.908592 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:17.908544 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:28:17.908978 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:17.908604 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:28:27.910810 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:27.910772 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:28:27.912027 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:27.912000 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:28:29.003323 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:29.003265 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"] Apr 22 14:28:29.679807 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:29.679765 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" 
podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="main" containerID="cri-o://b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0" gracePeriod=30 Apr 22 14:28:29.680026 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:29.679799 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="tokenizer" containerID="cri-o://3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26" gracePeriod=30 Apr 22 14:28:30.684503 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:30.684467 2578 generic.go:358] "Generic (PLEG): container finished" podID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerID="b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0" exitCode=0 Apr 22 14:28:30.684866 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:30.684536 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" event={"ID":"8a14fac9-cee9-4d51-a8c6-a43e78a251b3","Type":"ContainerDied","Data":"b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0"} Apr 22 14:28:31.022169 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.022145 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:28:31.060146 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060116 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tls-certs\") pod \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " Apr 22 14:28:31.060342 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060155 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kserve-provision-location\") pod \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " Apr 22 14:28:31.060342 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060178 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-uds\") pod \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " Apr 22 14:28:31.060342 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060206 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rcx6z\" (UniqueName: \"kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z\") pod \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " Apr 22 14:28:31.060342 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060264 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-tmp\") pod \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " 
Apr 22 14:28:31.060342 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060329 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-cache\") pod \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\" (UID: \"8a14fac9-cee9-4d51-a8c6-a43e78a251b3\") " Apr 22 14:28:31.060655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060620 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "8a14fac9-cee9-4d51-a8c6-a43e78a251b3" (UID: "8a14fac9-cee9-4d51-a8c6-a43e78a251b3"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:31.060716 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060632 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "8a14fac9-cee9-4d51-a8c6-a43e78a251b3" (UID: "8a14fac9-cee9-4d51-a8c6-a43e78a251b3"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:31.060857 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.060837 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "8a14fac9-cee9-4d51-a8c6-a43e78a251b3" (UID: "8a14fac9-cee9-4d51-a8c6-a43e78a251b3"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:31.061101 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.061073 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8a14fac9-cee9-4d51-a8c6-a43e78a251b3" (UID: "8a14fac9-cee9-4d51-a8c6-a43e78a251b3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:28:31.062645 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.062611 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8a14fac9-cee9-4d51-a8c6-a43e78a251b3" (UID: "8a14fac9-cee9-4d51-a8c6-a43e78a251b3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:28:31.063014 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.062996 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z" (OuterVolumeSpecName: "kube-api-access-rcx6z") pod "8a14fac9-cee9-4d51-a8c6-a43e78a251b3" (UID: "8a14fac9-cee9-4d51-a8c6-a43e78a251b3"). InnerVolumeSpecName "kube-api-access-rcx6z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:28:31.161626 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.161587 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:28:31.161626 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.161620 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:28:31.161626 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.161630 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:28:31.161857 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.161639 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:28:31.161857 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.161650 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rcx6z\" (UniqueName: \"kubernetes.io/projected/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-kube-api-access-rcx6z\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:28:31.161857 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.161659 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/8a14fac9-cee9-4d51-a8c6-a43e78a251b3-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:28:31.689337 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:28:31.689282 2578 generic.go:358] "Generic (PLEG): container finished" podID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerID="3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26" exitCode=0 Apr 22 14:28:31.689337 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.689330 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" event={"ID":"8a14fac9-cee9-4d51-a8c6-a43e78a251b3","Type":"ContainerDied","Data":"3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26"} Apr 22 14:28:31.689817 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.689369 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" Apr 22 14:28:31.689817 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.689384 2578 scope.go:117] "RemoveContainer" containerID="3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26" Apr 22 14:28:31.689817 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.689373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h" event={"ID":"8a14fac9-cee9-4d51-a8c6-a43e78a251b3","Type":"ContainerDied","Data":"78ffd215ec594a5b198f4d63ca2831dd0bf34de142a96993600959e49495de1e"} Apr 22 14:28:31.698766 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.698748 2578 scope.go:117] "RemoveContainer" containerID="b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0" Apr 22 14:28:31.706337 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.706316 2578 scope.go:117] "RemoveContainer" containerID="f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc" Apr 22 14:28:31.713500 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.713476 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"] Apr 22 14:28:31.714188 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.714172 2578 scope.go:117] "RemoveContainer" containerID="3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26" Apr 22 14:28:31.714615 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:28:31.714594 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26\": container with ID starting with 3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26 not found: ID does not exist" containerID="3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26" Apr 22 14:28:31.714692 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.714622 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26"} err="failed to get container status \"3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26\": rpc error: code = NotFound desc = could not find container \"3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26\": container with ID starting with 3a925714f86faebe1db67a5e329f412c9e147692eeed837bc429d23faa29fe26 not found: ID does not exist" Apr 22 14:28:31.714692 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.714641 2578 scope.go:117] "RemoveContainer" containerID="b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0" Apr 22 14:28:31.714844 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:28:31.714828 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0\": container with ID starting with b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0 not found: ID does not exist" 
containerID="b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0" Apr 22 14:28:31.714902 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.714847 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0"} err="failed to get container status \"b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0\": rpc error: code = NotFound desc = could not find container \"b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0\": container with ID starting with b42447e8271bde3315147b51ac24e44ee125308977e91b2c087cf97a1a28b5e0 not found: ID does not exist" Apr 22 14:28:31.714902 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.714859 2578 scope.go:117] "RemoveContainer" containerID="f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc" Apr 22 14:28:31.715060 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:28:31.715046 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc\": container with ID starting with f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc not found: ID does not exist" containerID="f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc" Apr 22 14:28:31.715101 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.715062 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc"} err="failed to get container status \"f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc\": rpc error: code = NotFound desc = could not find container \"f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc\": container with ID starting with f606f6db17b5736b981d4c05cfc6360c415c0daefcd0fad8c4642755008407bc not found: ID does not exist" Apr 22 
Apr 22 14:28:31.717900 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.717876 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-configmap-ref-test-kserve-router-scheduler-79cdchjh7h"]
Apr 22 14:28:31.951877 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:31.951797 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" path="/var/lib/kubelet/pods/8a14fac9-cee9-4d51-a8c6-a43e78a251b3/volumes"
Apr 22 14:28:51.842683 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.842649 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"]
Apr 22 14:28:51.843134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.842965 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="main"
Apr 22 14:28:51.843134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.842976 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="main"
Apr 22 14:28:51.843134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.842986 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="tokenizer"
Apr 22 14:28:51.843134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.842992 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="tokenizer"
Apr 22 14:28:51.843134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.843010 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="storage-initializer"
Apr 22 14:28:51.843134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.843016 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="storage-initializer"
Apr 22 14:28:51.843134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.843072 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="main"
Apr 22 14:28:51.843134 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.843079 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8a14fac9-cee9-4d51-a8c6-a43e78a251b3" containerName="tokenizer"
Apr 22 14:28:51.856013 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.855981 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:51.858582 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.858555 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"]
Apr 22 14:28:51.859320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.859284 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\""
Apr 22 14:28:51.859443 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.859286 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q979b\""
Apr 22 14:28:51.860113 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.860098 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\""
Apr 22 14:28:51.860183 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.860121 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\""
Apr 22 14:28:51.931525 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.931489 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:51.931744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.931547 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2fb08a8-0845-424e-b738-aba3f384d02c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:51.931744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.931615 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-dshm\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:51.931744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.931646 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-phtn5\" (UniqueName: \"kubernetes.io/projected/b2fb08a8-0845-424e-b738-aba3f384d02c-kube-api-access-phtn5\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:51.931744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.931694 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-home\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:51.931744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:51.931716 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.032431 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.032377 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-dshm\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.032637 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.032440 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-phtn5\" (UniqueName: \"kubernetes.io/projected/b2fb08a8-0845-424e-b738-aba3f384d02c-kube-api-access-phtn5\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.032637 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.032502 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-home\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.032637 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.032525 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.032806 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.032670 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.032806 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.032730 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2fb08a8-0845-424e-b738-aba3f384d02c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.033261 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.032901 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-home\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.033261 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.033215 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.033261 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.033223 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-model-cache\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.034784 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.034765 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-dshm\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.035131 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.035110 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2fb08a8-0845-424e-b738-aba3f384d02c-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.042157 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.042132 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-phtn5\" (UniqueName: \"kubernetes.io/projected/b2fb08a8-0845-424e-b738-aba3f384d02c-kube-api-access-phtn5\") pod \"scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.117211 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.117123 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"]
Apr 22 14:28:52.121178 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.121154 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.124881 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.124861 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-cjn9l\""
Apr 22 14:28:52.132193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.132171 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"]
Apr 22 14:28:52.168848 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.168808 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:28:52.234918 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.234886 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.235062 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.234926 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.235062 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.234942 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.235062 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.234959 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-w7mpm\" (UniqueName: \"kubernetes.io/projected/ee182599-06bd-406b-850e-1a032889da18-kube-api-access-w7mpm\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.235062 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.234987 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.235062 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.235033 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee182599-06bd-406b-850e-1a032889da18-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.296338 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.296293 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"]
Apr 22 14:28:52.299090 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:28:52.299054 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb2fb08a8_0845_424e_b738_aba3f384d02c.slice/crio-d784ddb96207a41fc937721e718f132e73b50ec749b2c4745777cefc17fb8965 WatchSource:0}: Error finding container d784ddb96207a41fc937721e718f132e73b50ec749b2c4745777cefc17fb8965: Status 404 returned error can't find the container with id d784ddb96207a41fc937721e718f132e73b50ec749b2c4745777cefc17fb8965
Apr 22 14:28:52.335472 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.335438 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee182599-06bd-406b-850e-1a032889da18-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.335632 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.335489 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.335632 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.335512 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.335632 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.335528 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.335632 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.335552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-w7mpm\" (UniqueName: \"kubernetes.io/projected/ee182599-06bd-406b-850e-1a032889da18-kube-api-access-w7mpm\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.335632 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.335579 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.335986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.335958 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.336093 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.336010 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.336093 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.336028 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.336093 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.336067 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.337970 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.337949 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee182599-06bd-406b-850e-1a032889da18-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.344878 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.344853 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-w7mpm\" (UniqueName: \"kubernetes.io/projected/ee182599-06bd-406b-850e-1a032889da18-kube-api-access-w7mpm\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.431521 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.431478 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:52.565710 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.565682 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"]
Apr 22 14:28:52.568131 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:28:52.568102 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podee182599_06bd_406b_850e_1a032889da18.slice/crio-f259a7fcfc1df35dabd489093c2492e6fd07fc5b4f62fe228a7170fb669dca8c WatchSource:0}: Error finding container f259a7fcfc1df35dabd489093c2492e6fd07fc5b4f62fe228a7170fb669dca8c: Status 404 returned error can't find the container with id f259a7fcfc1df35dabd489093c2492e6fd07fc5b4f62fe228a7170fb669dca8c
Apr 22 14:28:52.762233 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.762190 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" event={"ID":"b2fb08a8-0845-424e-b738-aba3f384d02c","Type":"ContainerStarted","Data":"c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294"}
Apr 22 14:28:52.762233 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.762239 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" event={"ID":"b2fb08a8-0845-424e-b738-aba3f384d02c","Type":"ContainerStarted","Data":"d784ddb96207a41fc937721e718f132e73b50ec749b2c4745777cefc17fb8965"}
Apr 22 14:28:52.763805 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.763772 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" event={"ID":"ee182599-06bd-406b-850e-1a032889da18","Type":"ContainerStarted","Data":"f9d366ed03fe9ae3047722b37ecfb9a849475748332d08496cb1668b8e651c62"}
Apr 22 14:28:52.763955 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:52.763812 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" event={"ID":"ee182599-06bd-406b-850e-1a032889da18","Type":"ContainerStarted","Data":"f259a7fcfc1df35dabd489093c2492e6fd07fc5b4f62fe228a7170fb669dca8c"}
Apr 22 14:28:53.768659 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:53.768568 2578 generic.go:358] "Generic (PLEG): container finished" podID="ee182599-06bd-406b-850e-1a032889da18" containerID="f9d366ed03fe9ae3047722b37ecfb9a849475748332d08496cb1668b8e651c62" exitCode=0
Apr 22 14:28:53.769197 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:53.768654 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" event={"ID":"ee182599-06bd-406b-850e-1a032889da18","Type":"ContainerDied","Data":"f9d366ed03fe9ae3047722b37ecfb9a849475748332d08496cb1668b8e651c62"}
Apr 22 14:28:54.774355 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:54.774293 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" event={"ID":"ee182599-06bd-406b-850e-1a032889da18","Type":"ContainerStarted","Data":"70199e506da8797d9448559bbad932d8e71a9f78ca1bb0deae485037303e51ce"}
Apr 22 14:28:54.774355 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:54.774357 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" event={"ID":"ee182599-06bd-406b-850e-1a032889da18","Type":"ContainerStarted","Data":"48a1e5111e6b4f2bb95c2c2d3ad00aecfe0316f2a4258d52fde3225cf9416ae6"}
Apr 22 14:28:54.774897 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:54.774447 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:28:54.799375 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:54.799313 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" podStartSLOduration=2.799279331 podStartE2EDuration="2.799279331s" podCreationTimestamp="2026-04-22 14:28:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:28:54.796870307 +0000 UTC m=+835.471673572" watchObservedRunningTime="2026-04-22 14:28:54.799279331 +0000 UTC m=+835.474082596"
Apr 22 14:28:57.789812 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:57.789777 2578 generic.go:358] "Generic (PLEG): container finished" podID="b2fb08a8-0845-424e-b738-aba3f384d02c" containerID="c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294" exitCode=0
Apr 22 14:28:57.790289 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:57.789859 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" event={"ID":"b2fb08a8-0845-424e-b738-aba3f384d02c","Type":"ContainerDied","Data":"c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294"}
Apr 22 14:28:59.798694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:59.798659 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" event={"ID":"b2fb08a8-0845-424e-b738-aba3f384d02c","Type":"ContainerStarted","Data":"36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b"}
Apr 22 14:28:59.822252 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:28:59.822185 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" podStartSLOduration=7.623283613 podStartE2EDuration="8.822165383s" podCreationTimestamp="2026-04-22 14:28:51 +0000 UTC" firstStartedPulling="2026-04-22 14:28:57.791065301 +0000 UTC m=+838.465868543" lastFinishedPulling="2026-04-22 14:28:58.989947066 +0000 UTC m=+839.664750313" observedRunningTime="2026-04-22 14:28:59.819521991 +0000 UTC m=+840.494325269" watchObservedRunningTime="2026-04-22 14:28:59.822165383 +0000 UTC m=+840.496968649"
Apr 22 14:29:02.169053 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:02.169014 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:29:02.169053 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:02.169064 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:29:02.181633 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:02.181604 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:29:02.432056 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:02.431968 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:29:02.432208 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:02.432126 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:29:02.434944 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:02.434916 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:29:02.815391 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:02.815310 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"
Apr 22 14:29:02.825560 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:02.825531 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"
Apr 22 14:29:16.233977 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.233940 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"]
Apr 22 14:29:16.237782 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.237755 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.241827 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.241792 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-55f7ae4a-epp-sa-dockercfg-sxqtz\""
Apr 22 14:29:16.242830 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.242810 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\""
Apr 22 14:29:16.249205 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.249177 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"]
Apr 22 14:29:16.337115 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.337066 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.337288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.337148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sqrw6\" (UniqueName: \"kubernetes.io/projected/6d98b2ec-35eb-4123-b664-797da96d3339-kube-api-access-sqrw6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.337288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.337196 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.337288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.337268 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.337416 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.337324 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.337416 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.337368 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d98b2ec-35eb-4123-b664-797da96d3339-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.439029 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.438949 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.439227 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.439129 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sqrw6\" (UniqueName: \"kubernetes.io/projected/6d98b2ec-35eb-4123-b664-797da96d3339-kube-api-access-sqrw6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.439286 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.439236 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.439366 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.439293 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.439366 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.439355 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.439470 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.439411 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d98b2ec-35eb-4123-b664-797da96d3339-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.439470 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.439415 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-tmp\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.444153 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.442694 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d98b2ec-35eb-4123-b664-797da96d3339-tls-certs\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.444153 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.442954 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-kserve-provision-location\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.444153 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.443597 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-uds\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"
Apr 22 14:29:16.444153 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.443827 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName:
\"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-cache\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:16.452115 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.452085 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sqrw6\" (UniqueName: \"kubernetes.io/projected/6d98b2ec-35eb-4123-b664-797da96d3339-kube-api-access-sqrw6\") pod \"llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:16.549136 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.549038 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:16.677775 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.677744 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"] Apr 22 14:29:16.679031 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:29:16.679005 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6d98b2ec_35eb_4123_b664_797da96d3339.slice/crio-a5a9fa76177f8c3c7be32cee176dc168a89e3339069d4bb455a058ec789c5faf WatchSource:0}: Error finding container a5a9fa76177f8c3c7be32cee176dc168a89e3339069d4bb455a058ec789c5faf: Status 404 returned error can't find the container with id a5a9fa76177f8c3c7be32cee176dc168a89e3339069d4bb455a058ec789c5faf Apr 22 14:29:16.862786 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.862697 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" event={"ID":"6d98b2ec-35eb-4123-b664-797da96d3339","Type":"ContainerStarted","Data":"2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd"} Apr 22 14:29:16.862786 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:16.862748 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" event={"ID":"6d98b2ec-35eb-4123-b664-797da96d3339","Type":"ContainerStarted","Data":"a5a9fa76177f8c3c7be32cee176dc168a89e3339069d4bb455a058ec789c5faf"} Apr 22 14:29:17.867254 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:17.867158 2578 generic.go:358] "Generic (PLEG): container finished" podID="6d98b2ec-35eb-4123-b664-797da96d3339" containerID="2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd" exitCode=0 Apr 22 14:29:17.867659 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:17.867245 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" event={"ID":"6d98b2ec-35eb-4123-b664-797da96d3339","Type":"ContainerDied","Data":"2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd"} Apr 22 14:29:18.873435 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:18.873394 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" event={"ID":"6d98b2ec-35eb-4123-b664-797da96d3339","Type":"ContainerStarted","Data":"7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414"} Apr 22 14:29:18.873435 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:18.873432 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" 
event={"ID":"6d98b2ec-35eb-4123-b664-797da96d3339","Type":"ContainerStarted","Data":"8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33"} Apr 22 14:29:18.873950 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:18.873494 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:18.898373 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:18.898270 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" podStartSLOduration=2.8982504049999998 podStartE2EDuration="2.898250405s" podCreationTimestamp="2026-04-22 14:29:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:29:18.896831155 +0000 UTC m=+859.571634433" watchObservedRunningTime="2026-04-22 14:29:18.898250405 +0000 UTC m=+859.573053671" Apr 22 14:29:24.821894 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:24.821864 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" Apr 22 14:29:26.549711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:26.549671 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:26.549711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:26.549712 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:26.552577 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:26.552544 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:26.900818 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:26.900793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:38.693569 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.693537 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"] Apr 22 14:29:38.694059 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.693846 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="main" containerID="cri-o://48a1e5111e6b4f2bb95c2c2d3ad00aecfe0316f2a4258d52fde3225cf9416ae6" gracePeriod=30 Apr 22 14:29:38.694059 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.693907 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="tokenizer" containerID="cri-o://70199e506da8797d9448559bbad932d8e71a9f78ca1bb0deae485037303e51ce" gracePeriod=30 Apr 22 14:29:38.699262 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.699235 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"] Apr 22 14:29:38.699571 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.699548 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" podUID="b2fb08a8-0845-424e-b738-aba3f384d02c" containerName="main" containerID="cri-o://36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b" 
gracePeriod=30 Apr 22 14:29:38.938413 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.938384 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" Apr 22 14:29:38.942158 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.942126 2578 generic.go:358] "Generic (PLEG): container finished" podID="b2fb08a8-0845-424e-b738-aba3f384d02c" containerID="36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b" exitCode=0 Apr 22 14:29:38.942350 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.942211 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" event={"ID":"b2fb08a8-0845-424e-b738-aba3f384d02c","Type":"ContainerDied","Data":"36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b"} Apr 22 14:29:38.942350 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.942253 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" event={"ID":"b2fb08a8-0845-424e-b738-aba3f384d02c","Type":"ContainerDied","Data":"d784ddb96207a41fc937721e718f132e73b50ec749b2c4745777cefc17fb8965"} Apr 22 14:29:38.942350 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.942222 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx" Apr 22 14:29:38.942350 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.942279 2578 scope.go:117] "RemoveContainer" containerID="36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b" Apr 22 14:29:38.944394 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.944286 2578 generic.go:358] "Generic (PLEG): container finished" podID="ee182599-06bd-406b-850e-1a032889da18" containerID="48a1e5111e6b4f2bb95c2c2d3ad00aecfe0316f2a4258d52fde3225cf9416ae6" exitCode=0 Apr 22 14:29:38.944394 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.944370 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" event={"ID":"ee182599-06bd-406b-850e-1a032889da18","Type":"ContainerDied","Data":"48a1e5111e6b4f2bb95c2c2d3ad00aecfe0316f2a4258d52fde3225cf9416ae6"} Apr 22 14:29:38.951164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:38.951146 2578 scope.go:117] "RemoveContainer" containerID="c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294" Apr 22 14:29:39.015876 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.015851 2578 scope.go:117] "RemoveContainer" containerID="36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b" Apr 22 14:29:39.016203 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:29:39.016186 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b\": container with ID starting with 36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b not found: ID does not exist" containerID="36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b" Apr 22 14:29:39.016266 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.016213 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b"} err="failed to get container status \"36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b\": rpc error: code = NotFound desc = could not find container \"36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b\": container with ID starting with 36addf1ed8fbc042435909c799988bcaab95a686f4d7c728187e15063342fb5b not found: ID does not exist" Apr 22 14:29:39.016266 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.016231 2578 scope.go:117] "RemoveContainer" containerID="c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294" Apr 22 14:29:39.016471 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:29:39.016449 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294\": container with ID starting with c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294 not found: ID does not exist" containerID="c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294" Apr 22 14:29:39.016513 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.016477 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294"} err="failed to get container status \"c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294\": rpc error: code = NotFound desc = could not find container \"c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294\": container with ID starting with c1e79e469370cce651f18b09510667b1aa1d5ffb2fbec171d710a4bb03596294 not found: ID does not exist" Apr 22 14:29:39.037901 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.037866 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-dshm\") pod \"b2fb08a8-0845-424e-b738-aba3f384d02c\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " Apr 22 14:29:39.038006 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.037949 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-phtn5\" (UniqueName: \"kubernetes.io/projected/b2fb08a8-0845-424e-b738-aba3f384d02c-kube-api-access-phtn5\") pod \"b2fb08a8-0845-424e-b738-aba3f384d02c\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " Apr 22 14:29:39.038006 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.037978 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2fb08a8-0845-424e-b738-aba3f384d02c-tls-certs\") pod \"b2fb08a8-0845-424e-b738-aba3f384d02c\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " Apr 22 14:29:39.038082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.038025 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-home\") pod \"b2fb08a8-0845-424e-b738-aba3f384d02c\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " Apr 22 14:29:39.038082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.038059 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-model-cache\") pod \"b2fb08a8-0845-424e-b738-aba3f384d02c\" (UID: \"b2fb08a8-0845-424e-b738-aba3f384d02c\") " Apr 22 14:29:39.038173 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.038092 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-kserve-provision-location\") pod \"b2fb08a8-0845-424e-b738-aba3f384d02c\" (UID: 
\"b2fb08a8-0845-424e-b738-aba3f384d02c\") " Apr 22 14:29:39.038452 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.038386 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-home" (OuterVolumeSpecName: "home") pod "b2fb08a8-0845-424e-b738-aba3f384d02c" (UID: "b2fb08a8-0845-424e-b738-aba3f384d02c"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:39.038452 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.038412 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-model-cache" (OuterVolumeSpecName: "model-cache") pod "b2fb08a8-0845-424e-b738-aba3f384d02c" (UID: "b2fb08a8-0845-424e-b738-aba3f384d02c"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:39.040593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.040562 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b2fb08a8-0845-424e-b738-aba3f384d02c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b2fb08a8-0845-424e-b738-aba3f384d02c" (UID: "b2fb08a8-0845-424e-b738-aba3f384d02c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:29:39.040593 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.040587 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-dshm" (OuterVolumeSpecName: "dshm") pod "b2fb08a8-0845-424e-b738-aba3f384d02c" (UID: "b2fb08a8-0845-424e-b738-aba3f384d02c"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:39.040746 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.040641 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b2fb08a8-0845-424e-b738-aba3f384d02c-kube-api-access-phtn5" (OuterVolumeSpecName: "kube-api-access-phtn5") pod "b2fb08a8-0845-424e-b738-aba3f384d02c" (UID: "b2fb08a8-0845-424e-b738-aba3f384d02c"). InnerVolumeSpecName "kube-api-access-phtn5". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:29:39.093039 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.092989 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b2fb08a8-0845-424e-b738-aba3f384d02c" (UID: "b2fb08a8-0845-424e-b738-aba3f384d02c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:39.139196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.139153 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-phtn5\" (UniqueName: \"kubernetes.io/projected/b2fb08a8-0845-424e-b738-aba3f384d02c-kube-api-access-phtn5\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:39.139196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.139196 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b2fb08a8-0845-424e-b738-aba3f384d02c-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:39.139399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.139212 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-home\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:39.139399 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:29:39.139221 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-model-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:39.139399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.139231 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:39.139399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.139241 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b2fb08a8-0845-424e-b738-aba3f384d02c-dshm\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:39.268207 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.268172 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"] Apr 22 14:29:39.272087 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.272060 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-6857dbdbf5-gv4xx"] Apr 22 14:29:39.951946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.951907 2578 generic.go:358] "Generic (PLEG): container finished" podID="ee182599-06bd-406b-850e-1a032889da18" containerID="70199e506da8797d9448559bbad932d8e71a9f78ca1bb0deae485037303e51ce" exitCode=0 Apr 22 14:29:39.953047 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.953016 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b2fb08a8-0845-424e-b738-aba3f384d02c" path="/var/lib/kubelet/pods/b2fb08a8-0845-424e-b738-aba3f384d02c/volumes" Apr 22 14:29:39.953421 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:39.953398 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" event={"ID":"ee182599-06bd-406b-850e-1a032889da18","Type":"ContainerDied","Data":"70199e506da8797d9448559bbad932d8e71a9f78ca1bb0deae485037303e51ce"} Apr 22 14:29:40.043163 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.043132 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" Apr 22 14:29:40.147777 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.147744 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-cache\") pod \"ee182599-06bd-406b-850e-1a032889da18\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " Apr 22 14:29:40.147937 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.147788 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee182599-06bd-406b-850e-1a032889da18-tls-certs\") pod \"ee182599-06bd-406b-850e-1a032889da18\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " Apr 22 14:29:40.147937 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.147821 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-kserve-provision-location\") pod \"ee182599-06bd-406b-850e-1a032889da18\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " Apr 22 14:29:40.147937 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.147898 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-w7mpm\" (UniqueName: \"kubernetes.io/projected/ee182599-06bd-406b-850e-1a032889da18-kube-api-access-w7mpm\") pod \"ee182599-06bd-406b-850e-1a032889da18\" (UID: 
\"ee182599-06bd-406b-850e-1a032889da18\") " Apr 22 14:29:40.147937 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.147926 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-uds\") pod \"ee182599-06bd-406b-850e-1a032889da18\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " Apr 22 14:29:40.148121 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.147962 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-tmp\") pod \"ee182599-06bd-406b-850e-1a032889da18\" (UID: \"ee182599-06bd-406b-850e-1a032889da18\") " Apr 22 14:29:40.148121 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.148069 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "ee182599-06bd-406b-850e-1a032889da18" (UID: "ee182599-06bd-406b-850e-1a032889da18"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:40.148259 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.148239 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:40.148344 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.148259 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "ee182599-06bd-406b-850e-1a032889da18" (UID: "ee182599-06bd-406b-850e-1a032889da18"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:40.148441 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.148412 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "ee182599-06bd-406b-850e-1a032889da18" (UID: "ee182599-06bd-406b-850e-1a032889da18"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:40.148713 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.148690 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ee182599-06bd-406b-850e-1a032889da18" (UID: "ee182599-06bd-406b-850e-1a032889da18"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:29:40.150119 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.150088 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ee182599-06bd-406b-850e-1a032889da18-kube-api-access-w7mpm" (OuterVolumeSpecName: "kube-api-access-w7mpm") pod "ee182599-06bd-406b-850e-1a032889da18" (UID: "ee182599-06bd-406b-850e-1a032889da18"). InnerVolumeSpecName "kube-api-access-w7mpm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:29:40.150119 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.150107 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ee182599-06bd-406b-850e-1a032889da18-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ee182599-06bd-406b-850e-1a032889da18" (UID: "ee182599-06bd-406b-850e-1a032889da18"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:29:40.249561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.249477 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-w7mpm\" (UniqueName: \"kubernetes.io/projected/ee182599-06bd-406b-850e-1a032889da18-kube-api-access-w7mpm\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:40.249561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.249512 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:40.249561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.249522 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:40.249561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.249531 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ee182599-06bd-406b-850e-1a032889da18-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:40.249561 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.249540 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ee182599-06bd-406b-850e-1a032889da18-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:29:40.957504 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.957467 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" 
event={"ID":"ee182599-06bd-406b-850e-1a032889da18","Type":"ContainerDied","Data":"f259a7fcfc1df35dabd489093c2492e6fd07fc5b4f62fe228a7170fb669dca8c"} Apr 22 14:29:40.957910 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.957519 2578 scope.go:117] "RemoveContainer" containerID="70199e506da8797d9448559bbad932d8e71a9f78ca1bb0deae485037303e51ce" Apr 22 14:29:40.957910 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.957479 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx" Apr 22 14:29:40.966511 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.966491 2578 scope.go:117] "RemoveContainer" containerID="48a1e5111e6b4f2bb95c2c2d3ad00aecfe0316f2a4258d52fde3225cf9416ae6" Apr 22 14:29:40.974038 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.974017 2578 scope.go:117] "RemoveContainer" containerID="f9d366ed03fe9ae3047722b37ecfb9a849475748332d08496cb1668b8e651c62" Apr 22 14:29:40.981154 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.981126 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"] Apr 22 14:29:40.985093 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:40.985059 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-7b879b8zw8dx"] Apr 22 14:29:41.952022 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:41.951989 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ee182599-06bd-406b-850e-1a032889da18" path="/var/lib/kubelet/pods/ee182599-06bd-406b-850e-1a032889da18/volumes" Apr 22 14:29:47.906366 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:47.906328 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:29:52.690915 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.690876 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf"] Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691201 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="main" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691212 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="main" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691228 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2fb08a8-0845-424e-b738-aba3f384d02c" containerName="main" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691234 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fb08a8-0845-424e-b738-aba3f384d02c" containerName="main" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691241 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b2fb08a8-0845-424e-b738-aba3f384d02c" containerName="storage-initializer" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691247 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b2fb08a8-0845-424e-b738-aba3f384d02c" containerName="storage-initializer" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691261 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="tokenizer" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691266 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="tokenizer" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: 
I0422 14:29:52.691273 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="storage-initializer" Apr 22 14:29:52.691340 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691279 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="storage-initializer" Apr 22 14:29:52.691690 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691359 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="tokenizer" Apr 22 14:29:52.691690 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691368 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="b2fb08a8-0845-424e-b738-aba3f384d02c" containerName="main" Apr 22 14:29:52.691690 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.691377 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="ee182599-06bd-406b-850e-1a032889da18" containerName="main" Apr 22 14:29:52.694464 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.694446 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.697243 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.697218 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"precise-prefix-cache-test-kserve-self-signed-certs\"" Apr 22 14:29:52.713940 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.713912 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf"] Apr 22 14:29:52.861855 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.861818 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-model-cache\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.862043 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.861874 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-home\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.862043 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.861923 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkfdq\" (UniqueName: \"kubernetes.io/projected/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kube-api-access-dkfdq\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.862043 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.861945 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.862043 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.861964 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-dshm\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.862193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.862046 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.962738 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.962648 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.962738 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.962711 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-model-cache\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.962961 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.962770 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-home\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.962961 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.962797 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-dkfdq\" (UniqueName: \"kubernetes.io/projected/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kube-api-access-dkfdq\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.962961 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.962831 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.962961 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.962859 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-dshm\") pod 
\"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.963196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.963171 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kserve-provision-location\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.963196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.963189 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-model-cache\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.963319 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.963216 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-home\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.965130 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.965099 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-dshm\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.965409 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:29:52.965391 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-tls-certs\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:52.980947 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:52.980926 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkfdq\" (UniqueName: \"kubernetes.io/projected/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kube-api-access-dkfdq\") pod \"precise-prefix-cache-test-kserve-646bf96947-25kbf\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:53.005170 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:53.005139 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:29:53.133881 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:53.133853 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf"] Apr 22 14:29:53.136388 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:29:53.136350 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podb5197bb2_b5e5_4e66_8c1e_e72a9cb3b8d8.slice/crio-727ceb491583986999aa22391acbe59cf7764e8ca9e24d39652afc341106662b WatchSource:0}: Error finding container 727ceb491583986999aa22391acbe59cf7764e8ca9e24d39652afc341106662b: Status 404 returned error can't find the container with id 727ceb491583986999aa22391acbe59cf7764e8ca9e24d39652afc341106662b Apr 22 14:29:53.999365 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:53.999324 2578 kubelet.go:2569] "SyncLoop (PLEG): event for 
pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" event={"ID":"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8","Type":"ContainerStarted","Data":"2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936"} Apr 22 14:29:53.999365 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:53.999368 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" event={"ID":"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8","Type":"ContainerStarted","Data":"727ceb491583986999aa22391acbe59cf7764e8ca9e24d39652afc341106662b"} Apr 22 14:29:58.015495 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:58.015448 2578 generic.go:358] "Generic (PLEG): container finished" podID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" containerID="2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936" exitCode=0 Apr 22 14:29:58.015899 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:58.015517 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" event={"ID":"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8","Type":"ContainerDied","Data":"2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936"} Apr 22 14:29:59.020983 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:59.020941 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" event={"ID":"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8","Type":"ContainerStarted","Data":"86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313"} Apr 22 14:29:59.044895 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:59.044837 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" podStartSLOduration=7.044820593 podStartE2EDuration="7.044820593s" podCreationTimestamp="2026-04-22 14:29:52 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" 
lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:29:59.043518052 +0000 UTC m=+899.718321315" watchObservedRunningTime="2026-04-22 14:29:59.044820593 +0000 UTC m=+899.719623857" Apr 22 14:29:59.887197 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:59.887169 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log" Apr 22 14:29:59.889890 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:29:59.889862 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log" Apr 22 14:30:03.005278 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:03.005247 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:30:03.005713 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:03.005292 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:30:03.017809 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:03.017782 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:30:03.045982 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:03.045953 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:30:35.287416 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.287381 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf"] Apr 22 14:30:35.289813 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.287724 2578 kuberuntime_container.go:864] "Killing container with 
a grace period" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" podUID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" containerName="main" containerID="cri-o://86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313" gracePeriod=30 Apr 22 14:30:35.536971 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.536936 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:30:35.615183 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615107 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-tls-certs\") pod \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " Apr 22 14:30:35.615183 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615147 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kserve-provision-location\") pod \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " Apr 22 14:30:35.615183 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615177 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-home\") pod \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " Apr 22 14:30:35.615434 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615221 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkfdq\" (UniqueName: \"kubernetes.io/projected/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kube-api-access-dkfdq\") pod \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " Apr 
22 14:30:35.615434 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615267 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-model-cache\") pod \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " Apr 22 14:30:35.615524 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615438 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-dshm\") pod \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\" (UID: \"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8\") " Apr 22 14:30:35.615581 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615530 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-home" (OuterVolumeSpecName: "home") pod "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" (UID: "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:35.615638 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615615 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-model-cache" (OuterVolumeSpecName: "model-cache") pod "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" (UID: "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:35.615873 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615809 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-home\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.615873 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.615834 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-model-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.617522 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.617495 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" (UID: "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:30:35.617522 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.617509 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kube-api-access-dkfdq" (OuterVolumeSpecName: "kube-api-access-dkfdq") pod "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" (UID: "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8"). InnerVolumeSpecName "kube-api-access-dkfdq". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:30:35.617780 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.617766 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-dshm" (OuterVolumeSpecName: "dshm") pod "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" (UID: "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:35.670259 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.670190 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" (UID: "b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:30:35.717270 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.717235 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.717270 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.717265 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.717270 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.717276 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dkfdq\" (UniqueName: \"kubernetes.io/projected/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-kube-api-access-dkfdq\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:30:35.717519 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:35.717285 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8-dshm\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:30:36.147263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.147226 2578 generic.go:358] "Generic (PLEG): container finished" podID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" 
containerID="86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313" exitCode=0 Apr 22 14:30:36.147454 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.147288 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" Apr 22 14:30:36.147454 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.147329 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" event={"ID":"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8","Type":"ContainerDied","Data":"86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313"} Apr 22 14:30:36.147454 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.147373 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf" event={"ID":"b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8","Type":"ContainerDied","Data":"727ceb491583986999aa22391acbe59cf7764e8ca9e24d39652afc341106662b"} Apr 22 14:30:36.147454 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.147390 2578 scope.go:117] "RemoveContainer" containerID="86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313" Apr 22 14:30:36.155906 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.155891 2578 scope.go:117] "RemoveContainer" containerID="2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936" Apr 22 14:30:36.166916 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.166883 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf"] Apr 22 14:30:36.172596 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.172569 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/precise-prefix-cache-test-kserve-646bf96947-25kbf"] Apr 22 14:30:36.218529 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.218495 2578 scope.go:117] "RemoveContainer" 
containerID="86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313" Apr 22 14:30:36.218910 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:30:36.218886 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313\": container with ID starting with 86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313 not found: ID does not exist" containerID="86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313" Apr 22 14:30:36.218965 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.218925 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313"} err="failed to get container status \"86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313\": rpc error: code = NotFound desc = could not find container \"86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313\": container with ID starting with 86dcb3b936ca3b77026c0712cceb3714d917a733732372147497ff215e997313 not found: ID does not exist" Apr 22 14:30:36.218965 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.218950 2578 scope.go:117] "RemoveContainer" containerID="2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936" Apr 22 14:30:36.219345 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:30:36.219320 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936\": container with ID starting with 2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936 not found: ID does not exist" containerID="2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936" Apr 22 14:30:36.219444 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:36.219351 2578 pod_container_deletor.go:53] "DeleteContainer returned 
error" containerID={"Type":"cri-o","ID":"2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936"} err="failed to get container status \"2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936\": rpc error: code = NotFound desc = could not find container \"2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936\": container with ID starting with 2fe124edd4284feadb42441b53d94bb2bf5a3d7d3fdfc2a801e4d79c438c6936 not found: ID does not exist" Apr 22 14:30:37.951654 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:37.951575 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" path="/var/lib/kubelet/pods/b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8/volumes" Apr 22 14:30:53.957702 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.957666 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf"] Apr 22 14:30:53.958249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.958000 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" containerName="main" Apr 22 14:30:53.958249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.958010 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" containerName="main" Apr 22 14:30:53.958249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.958038 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" containerName="storage-initializer" Apr 22 14:30:53.958249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.958044 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" containerName="storage-initializer" Apr 22 14:30:53.958249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.958095 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="b5197bb2-b5e5-4e66-8c1e-e72a9cb3b8d8" containerName="main" Apr 22 14:30:53.961503 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.961482 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:53.964274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.964249 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 22 14:30:53.964399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.964251 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-tf9mb\"" Apr 22 14:30:53.971853 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:53.971830 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf"] Apr 22 14:30:54.076534 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.076494 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.076712 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.076537 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.076712 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.076635 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c61091-a891-42d3-8318-a5980c19d84c-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.076712 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.076667 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.076712 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.076698 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.076848 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.076830 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4mscg\" (UniqueName: \"kubernetes.io/projected/f2c61091-a891-42d3-8318-a5980c19d84c-kube-api-access-4mscg\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 
14:30:54.177824 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.177788 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4mscg\" (UniqueName: \"kubernetes.io/projected/f2c61091-a891-42d3-8318-a5980c19d84c-kube-api-access-4mscg\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.177980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.177845 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.177980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.177867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.177980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.177887 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c61091-a891-42d3-8318-a5980c19d84c-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.177980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.177906 2578 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.177980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.177929 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.178371 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.178341 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.178478 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.178402 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.178478 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.178416 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.178478 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.178465 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.180520 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.180501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c61091-a891-42d3-8318-a5980c19d84c-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.187807 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.187784 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4mscg\" (UniqueName: \"kubernetes.io/projected/f2c61091-a891-42d3-8318-a5980c19d84c-kube-api-access-4mscg\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.271255 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.271166 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:54.396574 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:54.396472 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf"] Apr 22 14:30:54.398778 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:30:54.398749 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podf2c61091_a891_42d3_8318_a5980c19d84c.slice/crio-3025f053d5e53e5fa14751d8798d7573b0481f96b38d31d1932f6b42bebfcfae WatchSource:0}: Error finding container 3025f053d5e53e5fa14751d8798d7573b0481f96b38d31d1932f6b42bebfcfae: Status 404 returned error can't find the container with id 3025f053d5e53e5fa14751d8798d7573b0481f96b38d31d1932f6b42bebfcfae Apr 22 14:30:55.216586 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:55.216545 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" event={"ID":"f2c61091-a891-42d3-8318-a5980c19d84c","Type":"ContainerStarted","Data":"e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014"} Apr 22 14:30:55.216586 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:55.216586 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" event={"ID":"f2c61091-a891-42d3-8318-a5980c19d84c","Type":"ContainerStarted","Data":"3025f053d5e53e5fa14751d8798d7573b0481f96b38d31d1932f6b42bebfcfae"} Apr 22 14:30:56.220992 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:56.220956 2578 generic.go:358] "Generic (PLEG): container finished" podID="f2c61091-a891-42d3-8318-a5980c19d84c" containerID="e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014" exitCode=0 Apr 22 14:30:56.221455 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:56.221015 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" event={"ID":"f2c61091-a891-42d3-8318-a5980c19d84c","Type":"ContainerDied","Data":"e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014"} Apr 22 14:30:57.226287 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:57.226249 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" event={"ID":"f2c61091-a891-42d3-8318-a5980c19d84c","Type":"ContainerStarted","Data":"a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e"} Apr 22 14:30:57.226287 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:57.226289 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" event={"ID":"f2c61091-a891-42d3-8318-a5980c19d84c","Type":"ContainerStarted","Data":"86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65"} Apr 22 14:30:57.226722 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:57.226504 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:30:57.246172 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:30:57.246107 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" podStartSLOduration=4.246088663 podStartE2EDuration="4.246088663s" podCreationTimestamp="2026-04-22 14:30:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:30:57.245761939 +0000 UTC m=+957.920565220" watchObservedRunningTime="2026-04-22 14:30:57.246088663 +0000 UTC m=+957.920891929" Apr 22 14:31:04.271835 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:04.271795 2578 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:31:04.272282 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:04.271972 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:31:04.274611 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:04.274587 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:31:05.254187 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:05.254155 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:31:27.260535 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:27.260449 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" Apr 22 14:31:43.012935 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:43.012895 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"] Apr 22 14:31:43.013414 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:43.013349 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="main" containerID="cri-o://8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33" gracePeriod=30 Apr 22 14:31:43.013414 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:43.013396 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" 
podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="tokenizer" containerID="cri-o://7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414" gracePeriod=30 Apr 22 14:31:43.381591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:43.381553 2578 generic.go:358] "Generic (PLEG): container finished" podID="6d98b2ec-35eb-4123-b664-797da96d3339" containerID="8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33" exitCode=0 Apr 22 14:31:43.381770 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:43.381633 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" event={"ID":"6d98b2ec-35eb-4123-b664-797da96d3339","Type":"ContainerDied","Data":"8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33"} Apr 22 14:31:44.355021 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.354992 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:31:44.388099 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.388062 2578 generic.go:358] "Generic (PLEG): container finished" podID="6d98b2ec-35eb-4123-b664-797da96d3339" containerID="7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414" exitCode=0 Apr 22 14:31:44.388253 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.388145 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" Apr 22 14:31:44.388253 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.388141 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" event={"ID":"6d98b2ec-35eb-4123-b664-797da96d3339","Type":"ContainerDied","Data":"7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414"} Apr 22 14:31:44.388253 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.388237 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm" event={"ID":"6d98b2ec-35eb-4123-b664-797da96d3339","Type":"ContainerDied","Data":"a5a9fa76177f8c3c7be32cee176dc168a89e3339069d4bb455a058ec789c5faf"} Apr 22 14:31:44.388391 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.388256 2578 scope.go:117] "RemoveContainer" containerID="7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414" Apr 22 14:31:44.396922 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.396899 2578 scope.go:117] "RemoveContainer" containerID="8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33" Apr 22 14:31:44.404983 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.404964 2578 scope.go:117] "RemoveContainer" containerID="2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd" Apr 22 14:31:44.412412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.412385 2578 scope.go:117] "RemoveContainer" containerID="7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414" Apr 22 14:31:44.412672 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:31:44.412653 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414\": container with ID starting with 
7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414 not found: ID does not exist" containerID="7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414" Apr 22 14:31:44.412726 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.412682 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414"} err="failed to get container status \"7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414\": rpc error: code = NotFound desc = could not find container \"7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414\": container with ID starting with 7d2e53048a37baee77692fea58ffa88ecfc5db4a222e9a9c0dc2ad5eb4b16414 not found: ID does not exist" Apr 22 14:31:44.412726 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.412702 2578 scope.go:117] "RemoveContainer" containerID="8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33" Apr 22 14:31:44.412913 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:31:44.412899 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33\": container with ID starting with 8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33 not found: ID does not exist" containerID="8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33" Apr 22 14:31:44.412957 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.412914 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33"} err="failed to get container status \"8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33\": rpc error: code = NotFound desc = could not find container \"8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33\": container with ID starting with 
8c131c7f23092ce54a884e5b938b313f5dced9972934cede1aab8b425ec1fc33 not found: ID does not exist" Apr 22 14:31:44.412957 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.412926 2578 scope.go:117] "RemoveContainer" containerID="2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd" Apr 22 14:31:44.413166 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:31:44.413146 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd\": container with ID starting with 2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd not found: ID does not exist" containerID="2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd" Apr 22 14:31:44.413226 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.413175 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd"} err="failed to get container status \"2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd\": rpc error: code = NotFound desc = could not find container \"2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd\": container with ID starting with 2d451db0f0620d0e58de5d3a9164241f84a7a445401cba56dd3d0942253511dd not found: ID does not exist" Apr 22 14:31:44.502518 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.502480 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-kserve-provision-location\") pod \"6d98b2ec-35eb-4123-b664-797da96d3339\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " Apr 22 14:31:44.502694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.502530 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sqrw6\" (UniqueName: 
\"kubernetes.io/projected/6d98b2ec-35eb-4123-b664-797da96d3339-kube-api-access-sqrw6\") pod \"6d98b2ec-35eb-4123-b664-797da96d3339\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " Apr 22 14:31:44.502694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.502551 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-tmp\") pod \"6d98b2ec-35eb-4123-b664-797da96d3339\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " Apr 22 14:31:44.502694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.502582 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6d98b2ec-35eb-4123-b664-797da96d3339-tls-certs\") pod \"6d98b2ec-35eb-4123-b664-797da96d3339\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " Apr 22 14:31:44.502694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.502608 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-uds\") pod \"6d98b2ec-35eb-4123-b664-797da96d3339\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " Apr 22 14:31:44.502694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.502671 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-cache\") pod \"6d98b2ec-35eb-4123-b664-797da96d3339\" (UID: \"6d98b2ec-35eb-4123-b664-797da96d3339\") " Apr 22 14:31:44.502957 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.502900 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6d98b2ec-35eb-4123-b664-797da96d3339" (UID: 
"6d98b2ec-35eb-4123-b664-797da96d3339"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:44.503015 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.502986 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6d98b2ec-35eb-4123-b664-797da96d3339" (UID: "6d98b2ec-35eb-4123-b664-797da96d3339"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:44.503059 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.503028 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6d98b2ec-35eb-4123-b664-797da96d3339" (UID: "6d98b2ec-35eb-4123-b664-797da96d3339"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:44.503361 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.503336 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6d98b2ec-35eb-4123-b664-797da96d3339" (UID: "6d98b2ec-35eb-4123-b664-797da96d3339"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:31:44.504753 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.504722 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6d98b2ec-35eb-4123-b664-797da96d3339-kube-api-access-sqrw6" (OuterVolumeSpecName: "kube-api-access-sqrw6") pod "6d98b2ec-35eb-4123-b664-797da96d3339" (UID: "6d98b2ec-35eb-4123-b664-797da96d3339"). InnerVolumeSpecName "kube-api-access-sqrw6". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:31:44.504862 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.504786 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6d98b2ec-35eb-4123-b664-797da96d3339-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6d98b2ec-35eb-4123-b664-797da96d3339" (UID: "6d98b2ec-35eb-4123-b664-797da96d3339"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:31:44.604149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.604094 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:31:44.604149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.604141 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:31:44.604149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.604152 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sqrw6\" (UniqueName: \"kubernetes.io/projected/6d98b2ec-35eb-4123-b664-797da96d3339-kube-api-access-sqrw6\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:31:44.604149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.604164 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:31:44.604458 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.604175 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/6d98b2ec-35eb-4123-b664-797da96d3339-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:31:44.604458 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.604184 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6d98b2ec-35eb-4123-b664-797da96d3339-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:31:44.713551 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.713518 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"]
Apr 22 14:31:44.718477 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:44.718449 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvcdde380eaa9fe1facad32d45131f9e34d-kserve-router-schel9zpm"]
Apr 22 14:31:45.952426 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:45.952384 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" path="/var/lib/kubelet/pods/6d98b2ec-35eb-4123-b664-797da96d3339/volumes"
Apr 22 14:31:55.008046 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008015 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"]
Apr 22 14:31:55.008472 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008455 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="tokenizer"
Apr 22 14:31:55.008472 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008474 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="tokenizer"
Apr 22 14:31:55.008553 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008486 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="main"
Apr 22 14:31:55.008553 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008491 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="main"
Apr 22 14:31:55.008553 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008503 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="storage-initializer"
Apr 22 14:31:55.008553 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008509 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="storage-initializer"
Apr 22 14:31:55.008690 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008575 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="main"
Apr 22 14:31:55.008690 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.008582 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6d98b2ec-35eb-4123-b664-797da96d3339" containerName="tokenizer"
Apr 22 14:31:55.011668 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.011647 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.014389 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.014365 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-epp-sa-dockercfg-zt2bx\""
Apr 22 14:31:55.014558 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.014365 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 22 14:31:55.022695 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.022671 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"]
Apr 22 14:31:55.101191 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.101160 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c86147a5-270f-4580-b2a6-88aa5514adf4-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.101429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.101216 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.101429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.101275 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.101429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.101333 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.101429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.101404 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.101591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.101437 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zczds\" (UniqueName: \"kubernetes.io/projected/c86147a5-270f-4580-b2a6-88aa5514adf4-kube-api-access-zczds\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.201938 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.201901 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c86147a5-270f-4580-b2a6-88aa5514adf4-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202031 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.201952 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202031 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.201975 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202031 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.201999 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202031 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.202023 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202226 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.202052 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zczds\" (UniqueName: \"kubernetes.io/projected/c86147a5-270f-4580-b2a6-88aa5514adf4-kube-api-access-zczds\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202507 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.202482 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-uds\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202588 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.202540 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202588 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.202555 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-tmp\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.202652 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.202587 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-cache\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.204438 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.204419 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c86147a5-270f-4580-b2a6-88aa5514adf4-tls-certs\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.211963 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.211933 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zczds\" (UniqueName: \"kubernetes.io/projected/c86147a5-270f-4580-b2a6-88aa5514adf4-kube-api-access-zczds\") pod \"custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.321667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.321567 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:55.449846 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:55.449818 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"]
Apr 22 14:31:55.452339 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:31:55.452284 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc86147a5_270f_4580_b2a6_88aa5514adf4.slice/crio-4f9d757a520466ff9f272d018c2cc5c13dcea48fa3b963db8ac5cb0ee5ff36bf WatchSource:0}: Error finding container 4f9d757a520466ff9f272d018c2cc5c13dcea48fa3b963db8ac5cb0ee5ff36bf: Status 404 returned error can't find the container with id 4f9d757a520466ff9f272d018c2cc5c13dcea48fa3b963db8ac5cb0ee5ff36bf
Apr 22 14:31:56.436994 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:56.436959 2578 generic.go:358] "Generic (PLEG): container finished" podID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerID="05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606" exitCode=0
Apr 22 14:31:56.437497 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:56.437045 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" event={"ID":"c86147a5-270f-4580-b2a6-88aa5514adf4","Type":"ContainerDied","Data":"05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606"}
Apr 22 14:31:56.437497 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:56.437094 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" event={"ID":"c86147a5-270f-4580-b2a6-88aa5514adf4","Type":"ContainerStarted","Data":"4f9d757a520466ff9f272d018c2cc5c13dcea48fa3b963db8ac5cb0ee5ff36bf"}
Apr 22 14:31:57.442253 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:57.442218 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" event={"ID":"c86147a5-270f-4580-b2a6-88aa5514adf4","Type":"ContainerStarted","Data":"0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a"}
Apr 22 14:31:57.442253 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:57.442258 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" event={"ID":"c86147a5-270f-4580-b2a6-88aa5514adf4","Type":"ContainerStarted","Data":"fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34"}
Apr 22 14:31:57.442701 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:57.442443 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:31:57.463090 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:31:57.463038 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" podStartSLOduration=3.463024924 podStartE2EDuration="3.463024924s" podCreationTimestamp="2026-04-22 14:31:54 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:31:57.461149492 +0000 UTC m=+1018.135952760" watchObservedRunningTime="2026-04-22 14:31:57.463024924 +0000 UTC m=+1018.137828188"
Apr 22 14:32:05.321950 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:05.321909 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:32:05.321950 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:05.321960 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:32:05.325007 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:05.324974 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:32:05.473478 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:05.473443 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:32:26.474117 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:26.474089 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"
Apr 22 14:32:45.619867 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:45.619780 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf"]
Apr 22 14:32:45.620429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:45.620151 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="main" containerID="cri-o://86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65" gracePeriod=30
Apr 22 14:32:45.620429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:45.620185 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="tokenizer" containerID="cri-o://a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e" gracePeriod=30
Apr 22 14:32:46.608496 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:46.608462 2578 generic.go:358] "Generic (PLEG): container finished" podID="f2c61091-a891-42d3-8318-a5980c19d84c" containerID="86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65" exitCode=0
Apr 22 14:32:46.608659 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:46.608533 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" event={"ID":"f2c61091-a891-42d3-8318-a5980c19d84c","Type":"ContainerDied","Data":"86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65"}
Apr 22 14:32:46.965212 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:46.965189 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf"
Apr 22 14:32:47.045444 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045410 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-tmp\") pod \"f2c61091-a891-42d3-8318-a5980c19d84c\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") "
Apr 22 14:32:47.045623 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045451 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-cache\") pod \"f2c61091-a891-42d3-8318-a5980c19d84c\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") "
Apr 22 14:32:47.045623 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045486 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-kserve-provision-location\") pod \"f2c61091-a891-42d3-8318-a5980c19d84c\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") "
Apr 22 14:32:47.045623 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045520 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c61091-a891-42d3-8318-a5980c19d84c-tls-certs\") pod \"f2c61091-a891-42d3-8318-a5980c19d84c\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") "
Apr 22 14:32:47.045623 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045562 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-uds\") pod \"f2c61091-a891-42d3-8318-a5980c19d84c\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") "
Apr 22 14:32:47.045832 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045622 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4mscg\" (UniqueName: \"kubernetes.io/projected/f2c61091-a891-42d3-8318-a5980c19d84c-kube-api-access-4mscg\") pod \"f2c61091-a891-42d3-8318-a5980c19d84c\" (UID: \"f2c61091-a891-42d3-8318-a5980c19d84c\") "
Apr 22 14:32:47.045832 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045792 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "f2c61091-a891-42d3-8318-a5980c19d84c" (UID: "f2c61091-a891-42d3-8318-a5980c19d84c"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:32:47.045928 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045820 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "f2c61091-a891-42d3-8318-a5980c19d84c" (UID: "f2c61091-a891-42d3-8318-a5980c19d84c"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:32:47.045928 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045869 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "f2c61091-a891-42d3-8318-a5980c19d84c" (UID: "f2c61091-a891-42d3-8318-a5980c19d84c"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:32:47.046027 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045942 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:32:47.046027 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.045958 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:32:47.046122 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.046037 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:32:47.046261 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.046236 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "f2c61091-a891-42d3-8318-a5980c19d84c" (UID: "f2c61091-a891-42d3-8318-a5980c19d84c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:32:47.047825 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.047806 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/f2c61091-a891-42d3-8318-a5980c19d84c-kube-api-access-4mscg" (OuterVolumeSpecName: "kube-api-access-4mscg") pod "f2c61091-a891-42d3-8318-a5980c19d84c" (UID: "f2c61091-a891-42d3-8318-a5980c19d84c"). InnerVolumeSpecName "kube-api-access-4mscg". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:32:47.047887 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.047857 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/f2c61091-a891-42d3-8318-a5980c19d84c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "f2c61091-a891-42d3-8318-a5980c19d84c" (UID: "f2c61091-a891-42d3-8318-a5980c19d84c"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:32:47.147185 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.147095 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4mscg\" (UniqueName: \"kubernetes.io/projected/f2c61091-a891-42d3-8318-a5980c19d84c-kube-api-access-4mscg\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:32:47.147185 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.147130 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/f2c61091-a891-42d3-8318-a5980c19d84c-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:32:47.147185 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.147141 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/f2c61091-a891-42d3-8318-a5980c19d84c-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:32:47.614183 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.614092 2578 generic.go:358] "Generic (PLEG): container finished" podID="f2c61091-a891-42d3-8318-a5980c19d84c" containerID="a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e" exitCode=0
Apr 22 14:32:47.614335 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.614187 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" event={"ID":"f2c61091-a891-42d3-8318-a5980c19d84c","Type":"ContainerDied","Data":"a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e"}
Apr 22 14:32:47.614335 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.614214 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf"
Apr 22 14:32:47.614335 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.614230 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf" event={"ID":"f2c61091-a891-42d3-8318-a5980c19d84c","Type":"ContainerDied","Data":"3025f053d5e53e5fa14751d8798d7573b0481f96b38d31d1932f6b42bebfcfae"}
Apr 22 14:32:47.614335 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.614248 2578 scope.go:117] "RemoveContainer" containerID="a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e"
Apr 22 14:32:47.622855 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.622829 2578 scope.go:117] "RemoveContainer" containerID="86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65"
Apr 22 14:32:47.629956 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.629941 2578 scope.go:117] "RemoveContainer" containerID="e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014"
Apr 22 14:32:47.636785 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.636759 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf"]
Apr 22 14:32:47.637909 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.637896 2578 scope.go:117] "RemoveContainer" containerID="a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e"
Apr 22 14:32:47.638192 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:32:47.638146 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e\": container with ID starting with a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e not found: ID does not exist" containerID="a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e"
Apr 22 14:32:47.638192 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.638177 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e"} err="failed to get container status \"a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e\": rpc error: code = NotFound desc = could not find container \"a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e\": container with ID starting with a2ebaa70e453baae59e219fb8565e03fcfd6b21b33a3a38ed90ba66830cf673e not found: ID does not exist"
Apr 22 14:32:47.638192 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.638194 2578 scope.go:117] "RemoveContainer" containerID="86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65"
Apr 22 14:32:47.638743 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:32:47.638669 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65\": container with ID starting with 86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65 not found: ID does not exist" containerID="86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65"
Apr 22 14:32:47.638743 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.638711 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65"} err="failed to get container status \"86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65\": rpc error: code = NotFound desc = could not find container \"86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65\": container with ID starting with 86e63199fd9b5dc1dc6bf402b506737467206ccd129ae1b272c6379d9d185a65 not found: ID does not exist"
Apr 22 14:32:47.638743 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.638734 2578 scope.go:117] "RemoveContainer" containerID="e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014"
Apr 22 14:32:47.639201 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:32:47.639165 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014\": container with ID starting with e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014 not found: ID does not exist" containerID="e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014"
Apr 22 14:32:47.639201 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.639194 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014"} err="failed to get container status \"e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014\": rpc error: code = NotFound desc = could not find container \"e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014\": container with ID starting with e01eab006e0c38e8575f476139433936dca7d05dd17138fb87abec26a55c4014 not found: ID does not exist"
Apr 22 14:32:47.640522 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.640502 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-q7hcf"]
Apr 22 14:32:47.952405 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:32:47.952367 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" path="/var/lib/kubelet/pods/f2c61091-a891-42d3-8318-a5980c19d84c/volumes"
Apr 22 14:33:06.277159 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277125 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"]
Apr 22 14:33:06.277576 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277502 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="storage-initializer"
Apr 22 14:33:06.277576 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277515 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="storage-initializer"
Apr 22 14:33:06.277576 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277523 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="main"
Apr 22 14:33:06.277576 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277528 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="main"
Apr 22 14:33:06.277576 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277541 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="tokenizer"
Apr 22 14:33:06.277576 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277547 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="tokenizer"
Apr 22 14:33:06.277768 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277596 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="main"
Apr 22 14:33:06.277768 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.277609 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="f2c61091-a891-42d3-8318-a5980c19d84c" containerName="tokenizer"
Apr 22 14:33:06.282372 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.282352 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.287030 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.287005 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-epp-sa-dockercfg-qkw2v\""
Apr 22 14:33:06.287165 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.287099 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\""
Apr 22 14:33:06.299426 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.299398 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6bkf\" (UniqueName: \"kubernetes.io/projected/15f358a2-34a9-4ba9-a347-e7354c0721c9-kube-api-access-t6bkf\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.299594 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.299429 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.299594 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.299456 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.299594 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.299533 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"]
Apr 22 14:33:06.299594 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.299542 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.299781 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.299645 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.299781 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.299701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15f358a2-34a9-4ba9-a347-e7354c0721c9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.400114 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6bkf\" (UniqueName: \"kubernetes.io/projected/15f358a2-34a9-4ba9-a347-e7354c0721c9-kube-api-access-t6bkf\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.400114 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400117 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.400398 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400156 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"
Apr 22 14:33:06.400398 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400198 2578 reconciler_common.go:224] "operationExecutor.MountVolume
started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.400398 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400258 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.400398 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400341 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15f358a2-34a9-4ba9-a347-e7354c0721c9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.400694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400653 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-kserve-provision-location\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.400694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400683 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: 
\"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-tmp\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.400789 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400701 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-cache\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.400789 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.400735 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-uds\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.402849 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.402831 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15f358a2-34a9-4ba9-a347-e7354c0721c9-tls-certs\") pod \"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.407861 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.407834 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6bkf\" (UniqueName: \"kubernetes.io/projected/15f358a2-34a9-4ba9-a347-e7354c0721c9-kube-api-access-t6bkf\") pod 
\"stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.591875 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.591777 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:06.716452 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.716287 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"] Apr 22 14:33:06.721910 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:33:06.721421 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod15f358a2_34a9_4ba9_a347_e7354c0721c9.slice/crio-ab8c4c09a49c5461cb93623b8e3770e5ce2c928ef5ea723a0cada10c055988e5 WatchSource:0}: Error finding container ab8c4c09a49c5461cb93623b8e3770e5ce2c928ef5ea723a0cada10c055988e5: Status 404 returned error can't find the container with id ab8c4c09a49c5461cb93623b8e3770e5ce2c928ef5ea723a0cada10c055988e5 Apr 22 14:33:06.723749 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:06.723729 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:33:07.687943 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:07.687907 2578 generic.go:358] "Generic (PLEG): container finished" podID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerID="5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287" exitCode=0 Apr 22 14:33:07.688385 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:07.687978 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" 
event={"ID":"15f358a2-34a9-4ba9-a347-e7354c0721c9","Type":"ContainerDied","Data":"5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287"} Apr 22 14:33:07.688385 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:07.688003 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" event={"ID":"15f358a2-34a9-4ba9-a347-e7354c0721c9","Type":"ContainerStarted","Data":"ab8c4c09a49c5461cb93623b8e3770e5ce2c928ef5ea723a0cada10c055988e5"} Apr 22 14:33:08.694006 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:08.693970 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" event={"ID":"15f358a2-34a9-4ba9-a347-e7354c0721c9","Type":"ContainerStarted","Data":"69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370"} Apr 22 14:33:08.694006 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:08.694007 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" event={"ID":"15f358a2-34a9-4ba9-a347-e7354c0721c9","Type":"ContainerStarted","Data":"f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7"} Apr 22 14:33:08.694455 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:08.694105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:08.715890 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:08.715840 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" podStartSLOduration=2.715826032 podStartE2EDuration="2.715826032s" podCreationTimestamp="2026-04-22 14:33:06 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 14:33:08.71390229 +0000 UTC m=+1089.388705555" watchObservedRunningTime="2026-04-22 14:33:08.715826032 +0000 UTC m=+1089.390629296" Apr 22 14:33:16.592540 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:16.592499 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:16.592540 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:16.592542 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:16.595239 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:16.595212 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:16.720658 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:16.720619 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:37.725213 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:37.725184 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:33:40.956637 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:40.956604 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"] Apr 22 14:33:40.957019 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:40.956892 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="main" containerID="cri-o://fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34" 
gracePeriod=30 Apr 22 14:33:40.957019 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:40.956936 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="tokenizer" containerID="cri-o://0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a" gracePeriod=30 Apr 22 14:33:41.811629 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:41.811594 2578 generic.go:358] "Generic (PLEG): container finished" podID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerID="fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34" exitCode=0 Apr 22 14:33:41.811804 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:41.811678 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" event={"ID":"c86147a5-270f-4580-b2a6-88aa5514adf4","Type":"ContainerDied","Data":"fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34"} Apr 22 14:33:45.470823 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:45.470774 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.39:8082/healthz\": dial tcp 10.134.0.39:8082: connect: connection refused" Apr 22 14:33:46.208457 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.208429 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" Apr 22 14:33:46.309472 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309379 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c86147a5-270f-4580-b2a6-88aa5514adf4-tls-certs\") pod \"c86147a5-270f-4580-b2a6-88aa5514adf4\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " Apr 22 14:33:46.309472 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309418 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-uds\") pod \"c86147a5-270f-4580-b2a6-88aa5514adf4\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " Apr 22 14:33:46.309472 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309445 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zczds\" (UniqueName: \"kubernetes.io/projected/c86147a5-270f-4580-b2a6-88aa5514adf4-kube-api-access-zczds\") pod \"c86147a5-270f-4580-b2a6-88aa5514adf4\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " Apr 22 14:33:46.309744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309634 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-tmp\") pod \"c86147a5-270f-4580-b2a6-88aa5514adf4\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " Apr 22 14:33:46.309744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309715 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c86147a5-270f-4580-b2a6-88aa5514adf4" (UID: "c86147a5-270f-4580-b2a6-88aa5514adf4"). 
InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:33:46.309744 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309733 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-cache\") pod \"c86147a5-270f-4580-b2a6-88aa5514adf4\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " Apr 22 14:33:46.309883 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309775 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-kserve-provision-location\") pod \"c86147a5-270f-4580-b2a6-88aa5514adf4\" (UID: \"c86147a5-270f-4580-b2a6-88aa5514adf4\") " Apr 22 14:33:46.309998 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309971 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c86147a5-270f-4580-b2a6-88aa5514adf4" (UID: "c86147a5-270f-4580-b2a6-88aa5514adf4"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:33:46.310061 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.310000 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:33:46.310061 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.309981 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c86147a5-270f-4580-b2a6-88aa5514adf4" (UID: "c86147a5-270f-4580-b2a6-88aa5514adf4"). 
InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:33:46.310712 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.310682 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c86147a5-270f-4580-b2a6-88aa5514adf4" (UID: "c86147a5-270f-4580-b2a6-88aa5514adf4"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:33:46.311557 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.311540 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c86147a5-270f-4580-b2a6-88aa5514adf4-kube-api-access-zczds" (OuterVolumeSpecName: "kube-api-access-zczds") pod "c86147a5-270f-4580-b2a6-88aa5514adf4" (UID: "c86147a5-270f-4580-b2a6-88aa5514adf4"). InnerVolumeSpecName "kube-api-access-zczds". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:33:46.311714 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.311700 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c86147a5-270f-4580-b2a6-88aa5514adf4-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c86147a5-270f-4580-b2a6-88aa5514adf4" (UID: "c86147a5-270f-4580-b2a6-88aa5514adf4"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:33:46.410847 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.410808 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zczds\" (UniqueName: \"kubernetes.io/projected/c86147a5-270f-4580-b2a6-88aa5514adf4-kube-api-access-zczds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:33:46.410847 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.410841 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:33:46.410847 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.410852 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:33:46.411082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.410861 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c86147a5-270f-4580-b2a6-88aa5514adf4-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:33:46.411082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.410871 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c86147a5-270f-4580-b2a6-88aa5514adf4-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:33:46.834489 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.834451 2578 generic.go:358] "Generic (PLEG): container finished" podID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerID="0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a" exitCode=0 Apr 22 14:33:46.834898 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.834492 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" event={"ID":"c86147a5-270f-4580-b2a6-88aa5514adf4","Type":"ContainerDied","Data":"0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a"} Apr 22 14:33:46.834898 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.834524 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" Apr 22 14:33:46.834898 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.834533 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp" event={"ID":"c86147a5-270f-4580-b2a6-88aa5514adf4","Type":"ContainerDied","Data":"4f9d757a520466ff9f272d018c2cc5c13dcea48fa3b963db8ac5cb0ee5ff36bf"} Apr 22 14:33:46.834898 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.834549 2578 scope.go:117] "RemoveContainer" containerID="0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a" Apr 22 14:33:46.843273 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.843253 2578 scope.go:117] "RemoveContainer" containerID="fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34" Apr 22 14:33:46.850940 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.850921 2578 scope.go:117] "RemoveContainer" containerID="05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606" Apr 22 14:33:46.856770 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.856747 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"] Apr 22 14:33:46.859211 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.859193 2578 scope.go:117] "RemoveContainer" containerID="0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a" Apr 22 14:33:46.859589 ip-10-0-132-130 kubenswrapper[2578]: 
E0422 14:33:46.859567 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a\": container with ID starting with 0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a not found: ID does not exist" containerID="0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a" Apr 22 14:33:46.859678 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.859596 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a"} err="failed to get container status \"0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a\": rpc error: code = NotFound desc = could not find container \"0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a\": container with ID starting with 0a57fb81f2c31d785c9bac9f6a04ea767576083c91bad346cf9f509874a5ac4a not found: ID does not exist" Apr 22 14:33:46.859678 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.859615 2578 scope.go:117] "RemoveContainer" containerID="fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34" Apr 22 14:33:46.859857 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:33:46.859840 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34\": container with ID starting with fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34 not found: ID does not exist" containerID="fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34" Apr 22 14:33:46.859901 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.859853 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-router-scheduler-977d94f455fcp"] Apr 22 14:33:46.859901 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:33:46.859863 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34"} err="failed to get container status \"fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34\": rpc error: code = NotFound desc = could not find container \"fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34\": container with ID starting with fa520ad3af78dd40810fdca5a5ae492e08c5a11df82505114d9e3a018fc27c34 not found: ID does not exist" Apr 22 14:33:46.859901 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.859881 2578 scope.go:117] "RemoveContainer" containerID="05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606" Apr 22 14:33:46.860142 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:33:46.860125 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606\": container with ID starting with 05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606 not found: ID does not exist" containerID="05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606" Apr 22 14:33:46.860179 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:46.860149 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606"} err="failed to get container status \"05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606\": rpc error: code = NotFound desc = could not find container \"05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606\": container with ID starting with 05c46085648ef6403174fbc0f8b804a3e20a381fb82bfeb4989943a456862606 not found: ID does not exist" Apr 22 14:33:47.951670 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:33:47.951638 2578 kubelet_volumes.go:163] "Cleaned up orphaned 
pod volumes dir" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" path="/var/lib/kubelet/pods/c86147a5-270f-4580-b2a6-88aa5514adf4/volumes" Apr 22 14:34:08.738628 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.738589 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7"] Apr 22 14:34:08.739045 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.739031 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="storage-initializer" Apr 22 14:34:08.739091 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.739049 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="storage-initializer" Apr 22 14:34:08.739091 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.739067 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="main" Apr 22 14:34:08.739091 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.739076 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="main" Apr 22 14:34:08.739202 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.739091 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="tokenizer" Apr 22 14:34:08.739202 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.739100 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="tokenizer" Apr 22 14:34:08.739202 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.739168 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="main" Apr 22 14:34:08.739202 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.739184 2578 
memory_manager.go:356] "RemoveStaleState removing state" podUID="c86147a5-270f-4580-b2a6-88aa5514adf4" containerName="tokenizer" Apr 22 14:34:08.743923 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.743903 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.746661 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.746640 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-epp-sa-dockercfg-tqz45\"" Apr 22 14:34:08.746754 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.746700 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 22 14:34:08.752482 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.752454 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7"] Apr 22 14:34:08.898491 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.898454 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.898679 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.898506 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484f6ee2-18cc-4c90-a148-426b45734624-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.898679 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.898570 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.898679 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.898637 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.898679 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.898668 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-nlq2l\" (UniqueName: \"kubernetes.io/projected/484f6ee2-18cc-4c90-a148-426b45734624-kube-api-access-nlq2l\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.898812 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.898730 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: 
\"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.999601 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.999509 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.999601 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.999552 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484f6ee2-18cc-4c90-a148-426b45734624-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.999601 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.999574 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.999841 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.999623 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.999841 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.999649 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-nlq2l\" (UniqueName: \"kubernetes.io/projected/484f6ee2-18cc-4c90-a148-426b45734624-kube-api-access-nlq2l\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.999841 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.999677 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:08.999999 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:08.999970 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-tmp\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:09.000038 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.000021 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-cache\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:09.000078 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.000060 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-uds\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:09.000114 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.000078 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-kserve-provision-location\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:09.002109 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.002091 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484f6ee2-18cc-4c90-a148-426b45734624-tls-certs\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:09.010812 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.010786 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-nlq2l\" (UniqueName: \"kubernetes.io/projected/484f6ee2-18cc-4c90-a148-426b45734624-kube-api-access-nlq2l\") pod \"router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:09.054480 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.054444 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:09.180262 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.180229 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7"] Apr 22 14:34:09.183152 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:34:09.183118 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod484f6ee2_18cc_4c90_a148_426b45734624.slice/crio-4fef0feb6d4ec0ae51245dcc88e5ac342e5cbdc72112a149483216e5357ad4d6 WatchSource:0}: Error finding container 4fef0feb6d4ec0ae51245dcc88e5ac342e5cbdc72112a149483216e5357ad4d6: Status 404 returned error can't find the container with id 4fef0feb6d4ec0ae51245dcc88e5ac342e5cbdc72112a149483216e5357ad4d6 Apr 22 14:34:09.922047 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.922009 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" event={"ID":"484f6ee2-18cc-4c90-a148-426b45734624","Type":"ContainerStarted","Data":"7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08"} Apr 22 14:34:09.922047 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:09.922050 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" event={"ID":"484f6ee2-18cc-4c90-a148-426b45734624","Type":"ContainerStarted","Data":"4fef0feb6d4ec0ae51245dcc88e5ac342e5cbdc72112a149483216e5357ad4d6"} Apr 22 14:34:10.927084 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:10.927043 2578 generic.go:358] "Generic (PLEG): container 
finished" podID="484f6ee2-18cc-4c90-a148-426b45734624" containerID="7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08" exitCode=0 Apr 22 14:34:10.927475 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:10.927126 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" event={"ID":"484f6ee2-18cc-4c90-a148-426b45734624","Type":"ContainerDied","Data":"7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08"} Apr 22 14:34:11.932132 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:11.932092 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" event={"ID":"484f6ee2-18cc-4c90-a148-426b45734624","Type":"ContainerStarted","Data":"ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800"} Apr 22 14:34:11.932132 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:11.932133 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" event={"ID":"484f6ee2-18cc-4c90-a148-426b45734624","Type":"ContainerStarted","Data":"eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa"} Apr 22 14:34:11.932566 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:11.932247 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:11.955731 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:11.955678 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" podStartSLOduration=3.955663247 podStartE2EDuration="3.955663247s" podCreationTimestamp="2026-04-22 14:34:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 14:34:11.954171912 +0000 UTC m=+1152.628975215" watchObservedRunningTime="2026-04-22 14:34:11.955663247 +0000 UTC m=+1152.630466512" Apr 22 14:34:19.055193 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:19.055151 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:19.055701 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:19.055208 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:19.057913 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:19.057884 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:19.961137 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:19.961105 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:40.965130 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:40.965095 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:34:57.510163 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:57.510123 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"] Apr 22 14:34:57.510666 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:57.510464 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="main" 
containerID="cri-o://f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7" gracePeriod=30 Apr 22 14:34:57.510666 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:57.510511 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="tokenizer" containerID="cri-o://69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370" gracePeriod=30 Apr 22 14:34:57.723795 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:34:57.723758 2578 logging.go:55] [core] [Channel #311 SubChannel #312]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.40:9003", ServerName: "10.134.0.40:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.40:9003: connect: connection refused" Apr 22 14:34:58.093427 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.093393 2578 generic.go:358] "Generic (PLEG): container finished" podID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerID="f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7" exitCode=0 Apr 22 14:34:58.093608 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.093471 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" event={"ID":"15f358a2-34a9-4ba9-a347-e7354c0721c9","Type":"ContainerDied","Data":"f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7"} Apr 22 14:34:58.724450 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.724402 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.40:9003\" within 1s: context deadline exceeded" Apr 22 14:34:58.724800 ip-10-0-132-130 
kubenswrapper[2578]: W0422 14:34:58.724455 2578 logging.go:55] [core] [Channel #311 SubChannel #312]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.40:9003", ServerName: "10.134.0.40:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.40:9003: connect: connection refused" Apr 22 14:34:58.858183 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.858157 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:34:58.922027 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.921996 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-cache\") pod \"15f358a2-34a9-4ba9-a347-e7354c0721c9\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " Apr 22 14:34:58.922027 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922036 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-uds\") pod \"15f358a2-34a9-4ba9-a347-e7354c0721c9\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " Apr 22 14:34:58.922277 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922089 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-kserve-provision-location\") pod \"15f358a2-34a9-4ba9-a347-e7354c0721c9\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " Apr 22 14:34:58.922277 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922128 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15f358a2-34a9-4ba9-a347-e7354c0721c9-tls-certs\") pod 
\"15f358a2-34a9-4ba9-a347-e7354c0721c9\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " Apr 22 14:34:58.922277 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922151 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-tmp\") pod \"15f358a2-34a9-4ba9-a347-e7354c0721c9\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " Apr 22 14:34:58.922277 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922177 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t6bkf\" (UniqueName: \"kubernetes.io/projected/15f358a2-34a9-4ba9-a347-e7354c0721c9-kube-api-access-t6bkf\") pod \"15f358a2-34a9-4ba9-a347-e7354c0721c9\" (UID: \"15f358a2-34a9-4ba9-a347-e7354c0721c9\") " Apr 22 14:34:58.922491 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922343 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "15f358a2-34a9-4ba9-a347-e7354c0721c9" (UID: "15f358a2-34a9-4ba9-a347-e7354c0721c9"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:58.922491 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922365 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "15f358a2-34a9-4ba9-a347-e7354c0721c9" (UID: "15f358a2-34a9-4ba9-a347-e7354c0721c9"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:58.922491 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922422 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:34:58.922589 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922534 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "15f358a2-34a9-4ba9-a347-e7354c0721c9" (UID: "15f358a2-34a9-4ba9-a347-e7354c0721c9"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:58.922865 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.922843 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "15f358a2-34a9-4ba9-a347-e7354c0721c9" (UID: "15f358a2-34a9-4ba9-a347-e7354c0721c9"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:34:58.924374 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.924276 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/15f358a2-34a9-4ba9-a347-e7354c0721c9-kube-api-access-t6bkf" (OuterVolumeSpecName: "kube-api-access-t6bkf") pod "15f358a2-34a9-4ba9-a347-e7354c0721c9" (UID: "15f358a2-34a9-4ba9-a347-e7354c0721c9"). InnerVolumeSpecName "kube-api-access-t6bkf". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:34:58.924374 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:58.924290 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/15f358a2-34a9-4ba9-a347-e7354c0721c9-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "15f358a2-34a9-4ba9-a347-e7354c0721c9" (UID: "15f358a2-34a9-4ba9-a347-e7354c0721c9"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:34:59.023430 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.023381 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/15f358a2-34a9-4ba9-a347-e7354c0721c9-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:34:59.023430 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.023415 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:34:59.023430 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.023430 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t6bkf\" (UniqueName: \"kubernetes.io/projected/15f358a2-34a9-4ba9-a347-e7354c0721c9-kube-api-access-t6bkf\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:34:59.023658 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.023443 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:34:59.023658 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.023458 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/15f358a2-34a9-4ba9-a347-e7354c0721c9-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:34:59.098878 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.098844 2578 generic.go:358] "Generic (PLEG): container finished" podID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerID="69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370" exitCode=0 Apr 22 14:34:59.099041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.098922 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" Apr 22 14:34:59.099041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.098919 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" event={"ID":"15f358a2-34a9-4ba9-a347-e7354c0721c9","Type":"ContainerDied","Data":"69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370"} Apr 22 14:34:59.099041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.099026 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q" event={"ID":"15f358a2-34a9-4ba9-a347-e7354c0721c9","Type":"ContainerDied","Data":"ab8c4c09a49c5461cb93623b8e3770e5ce2c928ef5ea723a0cada10c055988e5"} Apr 22 14:34:59.099041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.099042 2578 scope.go:117] "RemoveContainer" containerID="69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370" Apr 22 14:34:59.107882 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.107861 2578 scope.go:117] "RemoveContainer" containerID="f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7" Apr 22 14:34:59.115166 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.115149 2578 scope.go:117] "RemoveContainer" 
containerID="5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287" Apr 22 14:34:59.122227 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.122199 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"] Apr 22 14:34:59.123923 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.123904 2578 scope.go:117] "RemoveContainer" containerID="69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370" Apr 22 14:34:59.124193 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:34:59.124170 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370\": container with ID starting with 69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370 not found: ID does not exist" containerID="69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370" Apr 22 14:34:59.124360 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.124198 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370"} err="failed to get container status \"69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370\": rpc error: code = NotFound desc = could not find container \"69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370\": container with ID starting with 69b9d1e2fa274bcde4899e679e263920a97cffbd5f5a0d0ac4972fdc8af73370 not found: ID does not exist" Apr 22 14:34:59.124360 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.124217 2578 scope.go:117] "RemoveContainer" containerID="f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7" Apr 22 14:34:59.124546 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:34:59.124522 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7\": container with ID starting with f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7 not found: ID does not exist" containerID="f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7" Apr 22 14:34:59.124612 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.124592 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7"} err="failed to get container status \"f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7\": rpc error: code = NotFound desc = could not find container \"f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7\": container with ID starting with f5e15a09ba7437d0200a2c3fb0acf4b27eda3f532e6c41f253f952fb36a315a7 not found: ID does not exist" Apr 22 14:34:59.124668 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.124617 2578 scope.go:117] "RemoveContainer" containerID="5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287" Apr 22 14:34:59.124978 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:34:59.124944 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287\": container with ID starting with 5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287 not found: ID does not exist" containerID="5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287" Apr 22 14:34:59.125080 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.124984 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287"} err="failed to get container status \"5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287\": rpc error: code = NotFound desc = could not find container 
\"5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287\": container with ID starting with 5cca08b938c27897566022290e0aea15a9529d79915bc4917416b0f4e925f287 not found: ID does not exist" Apr 22 14:34:59.126928 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.126906 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-router-scheduler-57d5754c4-8c96q"] Apr 22 14:34:59.909857 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.909829 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log" Apr 22 14:34:59.918407 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.918382 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log" Apr 22 14:34:59.951943 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:34:59.951907 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" path="/var/lib/kubelet/pods/15f358a2-34a9-4ba9-a347-e7354c0721c9/volumes" Apr 22 14:36:00.292635 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:00.292548 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7"] Apr 22 14:36:00.293076 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:00.292980 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="main" containerID="cri-o://eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa" gracePeriod=30 Apr 22 14:36:00.293076 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:00.293012 2578 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="tokenizer" containerID="cri-o://ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800" gracePeriod=30 Apr 22 14:36:00.963978 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:36:00.963945 2578 logging.go:55] [core] [Channel #352 SubChannel #353]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.41:9003", ServerName: "10.134.0.41:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.41:9003: connect: connection refused" Apr 22 14:36:01.311292 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.311194 2578 generic.go:358] "Generic (PLEG): container finished" podID="484f6ee2-18cc-4c90-a148-426b45734624" containerID="eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa" exitCode=0 Apr 22 14:36:01.311292 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.311236 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" event={"ID":"484f6ee2-18cc-4c90-a148-426b45734624","Type":"ContainerDied","Data":"eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa"} Apr 22 14:36:01.570417 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.570349 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:36:01.658891 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.658844 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-kserve-provision-location\") pod \"484f6ee2-18cc-4c90-a148-426b45734624\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " Apr 22 14:36:01.659082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.658903 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-uds\") pod \"484f6ee2-18cc-4c90-a148-426b45734624\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " Apr 22 14:36:01.659082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.658924 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-cache\") pod \"484f6ee2-18cc-4c90-a148-426b45734624\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " Apr 22 14:36:01.659082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.658949 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-tmp\") pod \"484f6ee2-18cc-4c90-a148-426b45734624\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " Apr 22 14:36:01.659082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.658973 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-nlq2l\" (UniqueName: \"kubernetes.io/projected/484f6ee2-18cc-4c90-a148-426b45734624-kube-api-access-nlq2l\") pod \"484f6ee2-18cc-4c90-a148-426b45734624\" (UID: 
\"484f6ee2-18cc-4c90-a148-426b45734624\") " Apr 22 14:36:01.659082 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.658998 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484f6ee2-18cc-4c90-a148-426b45734624-tls-certs\") pod \"484f6ee2-18cc-4c90-a148-426b45734624\" (UID: \"484f6ee2-18cc-4c90-a148-426b45734624\") " Apr 22 14:36:01.659373 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.659185 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "484f6ee2-18cc-4c90-a148-426b45734624" (UID: "484f6ee2-18cc-4c90-a148-426b45734624"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:01.659373 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.659349 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "484f6ee2-18cc-4c90-a148-426b45734624" (UID: "484f6ee2-18cc-4c90-a148-426b45734624"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:01.659517 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.659498 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "484f6ee2-18cc-4c90-a148-426b45734624" (UID: "484f6ee2-18cc-4c90-a148-426b45734624"). InnerVolumeSpecName "tokenizer-uds". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:01.660058 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.660028 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "484f6ee2-18cc-4c90-a148-426b45734624" (UID: "484f6ee2-18cc-4c90-a148-426b45734624"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:36:01.661200 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.661178 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/484f6ee2-18cc-4c90-a148-426b45734624-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "484f6ee2-18cc-4c90-a148-426b45734624" (UID: "484f6ee2-18cc-4c90-a148-426b45734624"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:36:01.661271 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.661245 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/484f6ee2-18cc-4c90-a148-426b45734624-kube-api-access-nlq2l" (OuterVolumeSpecName: "kube-api-access-nlq2l") pod "484f6ee2-18cc-4c90-a148-426b45734624" (UID: "484f6ee2-18cc-4c90-a148-426b45734624"). InnerVolumeSpecName "kube-api-access-nlq2l". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:36:01.760146 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.760109 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:36:01.760146 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.760142 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:36:01.760146 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.760153 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:36:01.760403 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.760165 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/484f6ee2-18cc-4c90-a148-426b45734624-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:36:01.760403 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.760175 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-nlq2l\" (UniqueName: \"kubernetes.io/projected/484f6ee2-18cc-4c90-a148-426b45734624-kube-api-access-nlq2l\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:36:01.760403 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:01.760185 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/484f6ee2-18cc-4c90-a148-426b45734624-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:36:01.964420 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:36:01.964381 2578 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.41:9003\" within 1s: context deadline exceeded" Apr 22 14:36:02.316538 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.316441 2578 generic.go:358] "Generic (PLEG): container finished" podID="484f6ee2-18cc-4c90-a148-426b45734624" containerID="ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800" exitCode=0 Apr 22 14:36:02.316538 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.316510 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" Apr 22 14:36:02.316538 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.316525 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" event={"ID":"484f6ee2-18cc-4c90-a148-426b45734624","Type":"ContainerDied","Data":"ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800"} Apr 22 14:36:02.317057 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.316573 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7" event={"ID":"484f6ee2-18cc-4c90-a148-426b45734624","Type":"ContainerDied","Data":"4fef0feb6d4ec0ae51245dcc88e5ac342e5cbdc72112a149483216e5357ad4d6"} Apr 22 14:36:02.317057 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.316594 2578 scope.go:117] "RemoveContainer" containerID="ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800" Apr 22 14:36:02.324949 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.324923 2578 scope.go:117] "RemoveContainer" 
containerID="eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa" Apr 22 14:36:02.332268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.332238 2578 scope.go:117] "RemoveContainer" containerID="7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08" Apr 22 14:36:02.335218 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.335195 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7"] Apr 22 14:36:02.342239 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.342216 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-router-scheduler-9b96b9744-clcz7"] Apr 22 14:36:02.342728 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.342713 2578 scope.go:117] "RemoveContainer" containerID="ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800" Apr 22 14:36:02.342993 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:36:02.342974 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800\": container with ID starting with ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800 not found: ID does not exist" containerID="ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800" Apr 22 14:36:02.343041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.343005 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800"} err="failed to get container status \"ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800\": rpc error: code = NotFound desc = could not find container \"ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800\": container with ID starting with ff0f3fccffd1a1f1c5b63cf468dc2cad6a66ccc9033f42754ce3d0a897b8a800 not 
found: ID does not exist" Apr 22 14:36:02.343041 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.343024 2578 scope.go:117] "RemoveContainer" containerID="eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa" Apr 22 14:36:02.343250 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:36:02.343230 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa\": container with ID starting with eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa not found: ID does not exist" containerID="eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa" Apr 22 14:36:02.343287 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.343256 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa"} err="failed to get container status \"eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa\": rpc error: code = NotFound desc = could not find container \"eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa\": container with ID starting with eb311137b060e4f71ea9dd21b4da29908ecbb4e39585cc011b18acf572a9bdfa not found: ID does not exist" Apr 22 14:36:02.343287 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.343272 2578 scope.go:117] "RemoveContainer" containerID="7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08" Apr 22 14:36:02.343518 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:36:02.343501 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08\": container with ID starting with 7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08 not found: ID does not exist" 
containerID="7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08" Apr 22 14:36:02.343570 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:02.343522 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08"} err="failed to get container status \"7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08\": rpc error: code = NotFound desc = could not find container \"7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08\": container with ID starting with 7633373ca27ff21d9b0d673f61e16aed332ca84d8e943ba03a2710a8d9cadf08 not found: ID does not exist" Apr 22 14:36:03.952263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:03.952222 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="484f6ee2-18cc-4c90-a148-426b45734624" path="/var/lib/kubelet/pods/484f6ee2-18cc-4c90-a148-426b45734624/volumes" Apr 22 14:36:16.360003 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.359970 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk"] Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360313 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="main" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360326 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="main" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360350 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="tokenizer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360359 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="tokenizer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360366 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="storage-initializer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360372 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="storage-initializer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360379 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="storage-initializer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360384 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="storage-initializer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360390 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="main" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360395 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="main" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360405 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="tokenizer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360410 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="tokenizer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360463 2578 memory_manager.go:356] "RemoveStaleState removing state" 
podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="tokenizer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360470 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="main" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360477 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="484f6ee2-18cc-4c90-a148-426b45734624" containerName="tokenizer" Apr 22 14:36:16.360527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.360483 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="15f358a2-34a9-4ba9-a347-e7354c0721c9" containerName="main" Apr 22 14:36:16.365573 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.365538 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.368294 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.368266 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-q979b\"" Apr 22 14:36:16.369433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.369371 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-epp-sa-dockercfg-nfm74\"" Apr 22 14:36:16.369433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.369394 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 22 14:36:16.369433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.369395 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 22 14:36:16.369651 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.369394 2578 reflector.go:430] "Caches 
populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 22 14:36:16.379562 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.379537 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk"] Apr 22 14:36:16.481527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.481490 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.481527 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.481534 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c523273f-bdec-4c81-a56c-7f74548814e0-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.481740 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.481552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.481740 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.481626 2578 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.481740 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.481670 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.481740 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.481701 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lffg9\" (UniqueName: \"kubernetes.io/projected/c523273f-bdec-4c81-a56c-7f74548814e0-kube-api-access-lffg9\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583054 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583017 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583054 ip-10-0-132-130 kubenswrapper[2578]: I0422 
14:36:16.583065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c523273f-bdec-4c81-a56c-7f74548814e0-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583290 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583083 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583290 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583197 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583290 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583251 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583516 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583368 2578 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-lffg9\" (UniqueName: \"kubernetes.io/projected/c523273f-bdec-4c81-a56c-7f74548814e0-kube-api-access-lffg9\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583516 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583501 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-cache\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583631 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583539 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-kserve-provision-location\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583631 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583564 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-tmp\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.583725 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.583641 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-uds\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.585651 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.585633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c523273f-bdec-4c81-a56c-7f74548814e0-tls-certs\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.591679 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.591651 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lffg9\" (UniqueName: \"kubernetes.io/projected/c523273f-bdec-4c81-a56c-7f74548814e0-kube-api-access-lffg9\") pod \"llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.676488 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.676447 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:16.803132 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:16.803077 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk"] Apr 22 14:36:16.805308 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:36:16.805268 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc523273f_bdec_4c81_a56c_7f74548814e0.slice/crio-e1542a4e312cc435f54b0756658a5cbddadd5c952885b36d6d3ff5faf35eeb1a WatchSource:0}: Error finding container e1542a4e312cc435f54b0756658a5cbddadd5c952885b36d6d3ff5faf35eeb1a: Status 404 returned error can't find the container with id e1542a4e312cc435f54b0756658a5cbddadd5c952885b36d6d3ff5faf35eeb1a Apr 22 14:36:17.369833 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:17.369794 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" event={"ID":"c523273f-bdec-4c81-a56c-7f74548814e0","Type":"ContainerStarted","Data":"5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28"} Apr 22 14:36:17.370219 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:17.369850 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" event={"ID":"c523273f-bdec-4c81-a56c-7f74548814e0","Type":"ContainerStarted","Data":"e1542a4e312cc435f54b0756658a5cbddadd5c952885b36d6d3ff5faf35eeb1a"} Apr 22 14:36:18.375073 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:18.375031 2578 generic.go:358] "Generic (PLEG): container finished" podID="c523273f-bdec-4c81-a56c-7f74548814e0" containerID="5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28" exitCode=0 Apr 22 14:36:18.375486 ip-10-0-132-130 kubenswrapper[2578]: I0422 
14:36:18.375123 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" event={"ID":"c523273f-bdec-4c81-a56c-7f74548814e0","Type":"ContainerDied","Data":"5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28"} Apr 22 14:36:19.380386 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:19.380350 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" event={"ID":"c523273f-bdec-4c81-a56c-7f74548814e0","Type":"ContainerStarted","Data":"36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9"} Apr 22 14:36:19.380386 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:19.380387 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" event={"ID":"c523273f-bdec-4c81-a56c-7f74548814e0","Type":"ContainerStarted","Data":"5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f"} Apr 22 14:36:19.380804 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:19.380524 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:19.402786 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:19.402729 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" podStartSLOduration=3.402715343 podStartE2EDuration="3.402715343s" podCreationTimestamp="2026-04-22 14:36:16 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:36:19.401470489 +0000 UTC m=+1280.076273766" watchObservedRunningTime="2026-04-22 14:36:19.402715343 +0000 UTC m=+1280.077518607" Apr 22 14:36:26.676654 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:36:26.676607 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:26.676654 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:26.676651 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:26.679358 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:26.679332 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:27.409779 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:27.409749 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:36:48.413617 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:36:48.413587 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:39:59.941309 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:39:59.941267 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log" Apr 22 14:39:59.947183 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:39:59.947154 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log" Apr 22 14:40:09.473285 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.473238 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 14:40:09.477336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.477310 2578 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.485591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.480314 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-zwtrk\"" Apr 22 14:40:09.485591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.481092 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\"" Apr 22 14:40:09.488821 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.488790 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 14:40:09.544664 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.544630 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg"] Apr 22 14:40:09.548126 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.548108 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.550896 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.550873 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5ec-epp-sa-dockercfg-tnk7r\"" Apr 22 14:40:09.551377 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.551348 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.551499 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.551420 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-zcd7h\" (UniqueName: \"kubernetes.io/projected/8ed11f5d-245b-4836-81b0-63f2109c1997-kube-api-access-zcd7h\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.551499 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.551459 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed11f5d-245b-4836-81b0-63f2109c1997-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.551499 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.551485 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.551668 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.551549 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.551668 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.551616 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.561579 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.561551 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg"] Apr 22 14:40:09.652184 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652148 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " 
pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.652420 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652190 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.652420 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652210 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.652420 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652239 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-zcd7h\" (UniqueName: \"kubernetes.io/projected/8ed11f5d-245b-4836-81b0-63f2109c1997-kube-api-access-zcd7h\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.652420 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652263 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed11f5d-245b-4836-81b0-63f2109c1997-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.652420 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652282 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.652420 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652329 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cpq7f\" (UniqueName: \"kubernetes.io/projected/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kube-api-access-cpq7f\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.652711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652429 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.652711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652482 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.652711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652518 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.652711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652552 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.652711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652587 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.652711 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.652912 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652744 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.652912 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.652844 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.654688 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.654669 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.654803 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.654787 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed11f5d-245b-4836-81b0-63f2109c1997-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.660475 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.660448 2578 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-zcd7h\" (UniqueName: \"kubernetes.io/projected/8ed11f5d-245b-4836-81b0-63f2109c1997-kube-api-access-zcd7h\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.753943 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.753836 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.753943 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.753891 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.753943 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.753917 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.754239 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.753978 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-cpq7f\" (UniqueName: \"kubernetes.io/projected/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kube-api-access-cpq7f\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.754239 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.754041 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.754239 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.754072 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.754448 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.754376 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-uds\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.754448 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.754387 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-cache\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.754547 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.754446 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kserve-provision-location\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.754547 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.754484 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-tmp\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.756437 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.756420 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tls-certs\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.762164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.762138 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cpq7f\" (UniqueName: 
\"kubernetes.io/projected/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kube-api-access-cpq7f\") pod \"llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") " pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.795930 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.795894 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" Apr 22 14:40:09.858192 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.858142 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:09.931336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.931252 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"] Apr 22 14:40:09.935203 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:40:09.935172 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8ed11f5d_245b_4836_81b0_63f2109c1997.slice/crio-ba66de5da075db52db0ebae1c52271a82972e914f452b86b70099b3fae3837ca WatchSource:0}: Error finding container ba66de5da075db52db0ebae1c52271a82972e914f452b86b70099b3fae3837ca: Status 404 returned error can't find the container with id ba66de5da075db52db0ebae1c52271a82972e914f452b86b70099b3fae3837ca Apr 22 14:40:09.937380 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:09.937359 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 22 14:40:10.015165 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:10.015143 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg"] Apr 22 14:40:10.017235 
ip-10-0-132-130 kubenswrapper[2578]: W0422 14:40:10.017199 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23f9f75d_9858_4ac9_b12c_b497d85a48ca.slice/crio-53cca9f55c7c165579cf0c809ece9987383bceec2342b0aaab97d3965bec3a0d WatchSource:0}: Error finding container 53cca9f55c7c165579cf0c809ece9987383bceec2342b0aaab97d3965bec3a0d: Status 404 returned error can't find the container with id 53cca9f55c7c165579cf0c809ece9987383bceec2342b0aaab97d3965bec3a0d Apr 22 14:40:10.154871 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:10.154796 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ed11f5d-245b-4836-81b0-63f2109c1997","Type":"ContainerStarted","Data":"e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1"} Apr 22 14:40:10.154871 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:10.154843 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ed11f5d-245b-4836-81b0-63f2109c1997","Type":"ContainerStarted","Data":"ba66de5da075db52db0ebae1c52271a82972e914f452b86b70099b3fae3837ca"} Apr 22 14:40:10.156291 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:10.156256 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" event={"ID":"23f9f75d-9858-4ac9-b12c-b497d85a48ca","Type":"ContainerStarted","Data":"44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80"} Apr 22 14:40:10.156442 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:10.156314 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" 
event={"ID":"23f9f75d-9858-4ac9-b12c-b497d85a48ca","Type":"ContainerStarted","Data":"53cca9f55c7c165579cf0c809ece9987383bceec2342b0aaab97d3965bec3a0d"} Apr 22 14:40:11.162270 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:11.162213 2578 generic.go:358] "Generic (PLEG): container finished" podID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerID="44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80" exitCode=0 Apr 22 14:40:11.162270 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:11.162314 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" event={"ID":"23f9f75d-9858-4ac9-b12c-b497d85a48ca","Type":"ContainerDied","Data":"44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80"} Apr 22 14:40:11.589885 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:11.589793 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk"] Apr 22 14:40:11.590162 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:11.590122 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="main" containerID="cri-o://5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f" gracePeriod=30 Apr 22 14:40:11.590320 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:11.590187 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="tokenizer" containerID="cri-o://36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9" gracePeriod=30 Apr 22 14:40:12.168362 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:12.168329 2578 generic.go:358] "Generic (PLEG): container 
finished" podID="c523273f-bdec-4c81-a56c-7f74548814e0" containerID="5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f" exitCode=0 Apr 22 14:40:12.168828 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:12.168407 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" event={"ID":"c523273f-bdec-4c81-a56c-7f74548814e0","Type":"ContainerDied","Data":"5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f"} Apr 22 14:40:12.170585 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:12.170560 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" event={"ID":"23f9f75d-9858-4ac9-b12c-b497d85a48ca","Type":"ContainerStarted","Data":"906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5"} Apr 22 14:40:12.170709 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:12.170590 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" event={"ID":"23f9f75d-9858-4ac9-b12c-b497d85a48ca","Type":"ContainerStarted","Data":"2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123"} Apr 22 14:40:12.170755 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:12.170733 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:12.216682 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:12.216622 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" podStartSLOduration=3.21659919 podStartE2EDuration="3.21659919s" podCreationTimestamp="2026-04-22 14:40:09 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" 
observedRunningTime="2026-04-22 14:40:12.215712403 +0000 UTC m=+1512.890515671" watchObservedRunningTime="2026-04-22 14:40:12.21659919 +0000 UTC m=+1512.891402456" Apr 22 14:40:13.072222 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.072194 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:40:13.177106 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.177067 2578 generic.go:358] "Generic (PLEG): container finished" podID="c523273f-bdec-4c81-a56c-7f74548814e0" containerID="36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9" exitCode=0 Apr 22 14:40:13.177106 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.177151 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" event={"ID":"c523273f-bdec-4c81-a56c-7f74548814e0","Type":"ContainerDied","Data":"36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9"} Apr 22 14:40:13.177758 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.177165 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" Apr 22 14:40:13.177758 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.177193 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk" event={"ID":"c523273f-bdec-4c81-a56c-7f74548814e0","Type":"ContainerDied","Data":"e1542a4e312cc435f54b0756658a5cbddadd5c952885b36d6d3ff5faf35eeb1a"} Apr 22 14:40:13.177758 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.177214 2578 scope.go:117] "RemoveContainer" containerID="36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9" Apr 22 14:40:13.186003 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.185983 2578 scope.go:117] "RemoveContainer" containerID="5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f" Apr 22 14:40:13.188289 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.188262 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c523273f-bdec-4c81-a56c-7f74548814e0-tls-certs\") pod \"c523273f-bdec-4c81-a56c-7f74548814e0\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " Apr 22 14:40:13.188433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.188315 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-kserve-provision-location\") pod \"c523273f-bdec-4c81-a56c-7f74548814e0\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " Apr 22 14:40:13.188433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.188368 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-uds\") pod \"c523273f-bdec-4c81-a56c-7f74548814e0\" (UID: 
\"c523273f-bdec-4c81-a56c-7f74548814e0\") " Apr 22 14:40:13.188433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.188398 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lffg9\" (UniqueName: \"kubernetes.io/projected/c523273f-bdec-4c81-a56c-7f74548814e0-kube-api-access-lffg9\") pod \"c523273f-bdec-4c81-a56c-7f74548814e0\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " Apr 22 14:40:13.188608 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.188464 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-tmp\") pod \"c523273f-bdec-4c81-a56c-7f74548814e0\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " Apr 22 14:40:13.188608 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.188494 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-cache\") pod \"c523273f-bdec-4c81-a56c-7f74548814e0\" (UID: \"c523273f-bdec-4c81-a56c-7f74548814e0\") " Apr 22 14:40:13.188989 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.188950 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "c523273f-bdec-4c81-a56c-7f74548814e0" (UID: "c523273f-bdec-4c81-a56c-7f74548814e0"). InnerVolumeSpecName "tokenizer-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:13.188989 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.188962 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "c523273f-bdec-4c81-a56c-7f74548814e0" (UID: "c523273f-bdec-4c81-a56c-7f74548814e0"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:13.189149 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.189035 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "c523273f-bdec-4c81-a56c-7f74548814e0" (UID: "c523273f-bdec-4c81-a56c-7f74548814e0"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:13.189705 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.189680 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "c523273f-bdec-4c81-a56c-7f74548814e0" (UID: "c523273f-bdec-4c81-a56c-7f74548814e0"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:40:13.191051 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.191028 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/c523273f-bdec-4c81-a56c-7f74548814e0-kube-api-access-lffg9" (OuterVolumeSpecName: "kube-api-access-lffg9") pod "c523273f-bdec-4c81-a56c-7f74548814e0" (UID: "c523273f-bdec-4c81-a56c-7f74548814e0"). InnerVolumeSpecName "kube-api-access-lffg9". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:40:13.191537 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.191508 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/c523273f-bdec-4c81-a56c-7f74548814e0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "c523273f-bdec-4c81-a56c-7f74548814e0" (UID: "c523273f-bdec-4c81-a56c-7f74548814e0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:40:13.218569 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.218537 2578 scope.go:117] "RemoveContainer" containerID="5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28" Apr 22 14:40:13.227468 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.227385 2578 scope.go:117] "RemoveContainer" containerID="36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9" Apr 22 14:40:13.227893 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:40:13.227831 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9\": container with ID starting with 36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9 not found: ID does not exist" containerID="36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9" Apr 22 14:40:13.228006 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.227897 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9"} err="failed to get container status \"36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9\": rpc error: code = NotFound desc = could not find container \"36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9\": container with ID starting with 36e60b0e4a82e83671cf68516941a35155e9e8c9c6aecaa791808248babe0ff9 not found: ID does not exist" Apr 22 
14:40:13.228006 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.227951 2578 scope.go:117] "RemoveContainer" containerID="5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f" Apr 22 14:40:13.228357 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:40:13.228333 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f\": container with ID starting with 5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f not found: ID does not exist" containerID="5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f" Apr 22 14:40:13.228465 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.228360 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f"} err="failed to get container status \"5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f\": rpc error: code = NotFound desc = could not find container \"5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f\": container with ID starting with 5392f59850ab48530033b37029911ff9956cbb52d82e992642673274cbe69a1f not found: ID does not exist" Apr 22 14:40:13.228465 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.228376 2578 scope.go:117] "RemoveContainer" containerID="5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28" Apr 22 14:40:13.228695 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:40:13.228665 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28\": container with ID starting with 5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28 not found: ID does not exist" containerID="5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28" Apr 22 14:40:13.228769 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.228705 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28"} err="failed to get container status \"5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28\": rpc error: code = NotFound desc = could not find container \"5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28\": container with ID starting with 5f4c119861f41f3b62a1bce3a40d6f1d620bd7f28801aab1f96afdd403b8ea28 not found: ID does not exist" Apr 22 14:40:13.289234 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.289190 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/c523273f-bdec-4c81-a56c-7f74548814e0-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:40:13.289234 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.289230 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:40:13.289465 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.289245 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:40:13.289465 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.289260 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lffg9\" (UniqueName: \"kubernetes.io/projected/c523273f-bdec-4c81-a56c-7f74548814e0-kube-api-access-lffg9\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:40:13.289465 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.289273 2578 reconciler_common.go:299] "Volume detached for volume 
\"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:40:13.289465 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.289286 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/c523273f-bdec-4c81-a56c-7f74548814e0-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:40:13.500102 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.500064 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk"] Apr 22 14:40:13.504189 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.504161 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc8f1a6f044e8c7a4d31a250e0c4861caf-kserve-router-scheb6phk"] Apr 22 14:40:13.953616 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:13.953578 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" path="/var/lib/kubelet/pods/c523273f-bdec-4c81-a56c-7f74548814e0/volumes" Apr 22 14:40:15.188836 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:15.188801 2578 generic.go:358] "Generic (PLEG): container finished" podID="8ed11f5d-245b-4836-81b0-63f2109c1997" containerID="e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1" exitCode=0 Apr 22 14:40:15.189213 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:15.188864 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ed11f5d-245b-4836-81b0-63f2109c1997","Type":"ContainerDied","Data":"e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1"} Apr 22 14:40:19.858517 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:19.858469 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not 
ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:19.858517 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:19.858521 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:19.860920 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:19.860412 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.44:8082/healthz\": dial tcp 10.134.0.44:8082: connect: connection refused" Apr 22 14:40:26.220265 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220183 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr"] Apr 22 14:40:26.221094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220708 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="main" Apr 22 14:40:26.221094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220729 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="main" Apr 22 14:40:26.221094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220768 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="storage-initializer" Apr 22 14:40:26.221094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220781 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="storage-initializer" Apr 22 14:40:26.221094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220793 2578 cpu_manager.go:401] 
"RemoveStaleState: containerMap: removing container" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="tokenizer" Apr 22 14:40:26.221094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220802 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="tokenizer" Apr 22 14:40:26.221094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220907 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="tokenizer" Apr 22 14:40:26.221094 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.220922 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="c523273f-bdec-4c81-a56c-7f74548814e0" containerName="main" Apr 22 14:40:26.414714 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.414673 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr"] Apr 22 14:40:26.414910 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.414870 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.417768 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.417741 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 22 14:40:26.417905 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.417794 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-epp-sa-dockercfg-jcfpz\"" Apr 22 14:40:26.528052 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.527944 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.528052 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.528011 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-xm9sh\" (UniqueName: \"kubernetes.io/projected/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kube-api-access-xm9sh\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.528406 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.528065 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: 
\"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.528406 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.528122 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.528406 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.528164 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.528406 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.528205 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629108 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629065 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-uds\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629111 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-xm9sh\" (UniqueName: \"kubernetes.io/projected/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kube-api-access-xm9sh\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629151 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629228 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629264 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629336 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629319 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629659 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629633 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-uds\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629723 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629675 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-tmp\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629918 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629861 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-cache\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: 
\"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.629918 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.629896 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.632717 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.632695 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.637750 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.637702 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-xm9sh\" (UniqueName: \"kubernetes.io/projected/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kube-api-access-xm9sh\") pod \"custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:26.728254 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:26.727779 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:28.204057 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:28.204023 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr"] Apr 22 14:40:28.206320 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:40:28.206267 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f11ebe1_fff6_4c8d_aef9_19b64b34e99e.slice/crio-41dc2f21711f6fb2b3c54edd38c5a1bf2c1e1fcdbf2175b2e5fcac7e40b37588 WatchSource:0}: Error finding container 41dc2f21711f6fb2b3c54edd38c5a1bf2c1e1fcdbf2175b2e5fcac7e40b37588: Status 404 returned error can't find the container with id 41dc2f21711f6fb2b3c54edd38c5a1bf2c1e1fcdbf2175b2e5fcac7e40b37588 Apr 22 14:40:28.253358 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:28.253292 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" event={"ID":"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e","Type":"ContainerStarted","Data":"41dc2f21711f6fb2b3c54edd38c5a1bf2c1e1fcdbf2175b2e5fcac7e40b37588"} Apr 22 14:40:29.259581 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:29.259543 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" event={"ID":"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e","Type":"ContainerStarted","Data":"24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c"} Apr 22 14:40:29.860026 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:29.859995 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:29.861431 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:29.861399 2578 
kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:30.265020 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:30.264968 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerID="24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c" exitCode=0 Apr 22 14:40:30.265494 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:30.265140 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" event={"ID":"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e","Type":"ContainerDied","Data":"24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c"} Apr 22 14:40:42.311764 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:42.311725 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" event={"ID":"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e","Type":"ContainerStarted","Data":"ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5"} Apr 22 14:40:42.311764 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:42.311766 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" event={"ID":"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e","Type":"ContainerStarted","Data":"b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770"} Apr 22 14:40:42.312200 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:42.311864 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:42.333202 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:42.333140 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" podStartSLOduration=16.333110494 podStartE2EDuration="16.333110494s" podCreationTimestamp="2026-04-22 14:40:26 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:40:42.331136524 +0000 UTC m=+1543.005939790" watchObservedRunningTime="2026-04-22 14:40:42.333110494 +0000 UTC m=+1543.007913761" Apr 22 14:40:43.320702 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:43.320664 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ed11f5d-245b-4836-81b0-63f2109c1997","Type":"ContainerStarted","Data":"972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6"} Apr 22 14:40:46.728974 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:46.728919 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:46.728974 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:46.728967 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:50.267235 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:50.267207 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" Apr 22 14:40:50.289729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:50.289660 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podStartSLOduration=13.931257839 podStartE2EDuration="41.289639377s" podCreationTimestamp="2026-04-22 14:40:09 +0000 UTC" firstStartedPulling="2026-04-22 14:40:15.190053752 
+0000 UTC m=+1515.864856994" lastFinishedPulling="2026-04-22 14:40:42.54843528 +0000 UTC m=+1543.223238532" observedRunningTime="2026-04-22 14:40:43.340514509 +0000 UTC m=+1544.015317777" watchObservedRunningTime="2026-04-22 14:40:50.289639377 +0000 UTC m=+1550.964442648" Apr 22 14:40:56.731980 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:56.731948 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:40:56.733244 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:40:56.733221 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:41:17.371678 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:41:17.371648 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:43:27.539747 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:27.539638 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg"] Apr 22 14:43:27.540348 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:27.540022 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="main" containerID="cri-o://2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123" gracePeriod=30 Apr 22 14:43:27.540348 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:27.540077 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="tokenizer" 
containerID="cri-o://906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5" gracePeriod=30
Apr 22 14:43:27.908362 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:27.908286 2578 generic.go:358] "Generic (PLEG): container finished" podID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerID="2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123" exitCode=0
Apr 22 14:43:27.908537 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:27.908362 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" event={"ID":"23f9f75d-9858-4ac9-b12c-b497d85a48ca","Type":"ContainerDied","Data":"2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123"}
Apr 22 14:43:28.883883 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.883857 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg"
Apr 22 14:43:28.913946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.913854 2578 generic.go:358] "Generic (PLEG): container finished" podID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerID="906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5" exitCode=0
Apr 22 14:43:28.913946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.913929 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg"
Apr 22 14:43:28.914131 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.913927 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" event={"ID":"23f9f75d-9858-4ac9-b12c-b497d85a48ca","Type":"ContainerDied","Data":"906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5"}
Apr 22 14:43:28.914131 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.914046 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg" event={"ID":"23f9f75d-9858-4ac9-b12c-b497d85a48ca","Type":"ContainerDied","Data":"53cca9f55c7c165579cf0c809ece9987383bceec2342b0aaab97d3965bec3a0d"}
Apr 22 14:43:28.914131 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.914071 2578 scope.go:117] "RemoveContainer" containerID="906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5"
Apr 22 14:43:28.922572 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.922554 2578 scope.go:117] "RemoveContainer" containerID="2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123"
Apr 22 14:43:28.929951 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.929928 2578 scope.go:117] "RemoveContainer" containerID="44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80"
Apr 22 14:43:28.937000 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.936983 2578 scope.go:117] "RemoveContainer" containerID="906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5"
Apr 22 14:43:28.937247 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:43:28.937228 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5\": container with ID starting with 906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5 not found: ID does not exist" containerID="906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5"
Apr 22 14:43:28.937346 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.937260 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5"} err="failed to get container status \"906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5\": rpc error: code = NotFound desc = could not find container \"906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5\": container with ID starting with 906ce0475ba1dde9d436ddc5424aefc02fe1777506c966ece5bbf067b679bea5 not found: ID does not exist"
Apr 22 14:43:28.937346 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.937286 2578 scope.go:117] "RemoveContainer" containerID="2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123"
Apr 22 14:43:28.937551 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:43:28.937533 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123\": container with ID starting with 2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123 not found: ID does not exist" containerID="2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123"
Apr 22 14:43:28.937592 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.937557 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123"} err="failed to get container status \"2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123\": rpc error: code = NotFound desc = could not find container \"2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123\": container with ID starting with
2f2d1ebfeb5609de9d274bc80899f1da7bbcb6f7edc2d80306263933184a8123 not found: ID does not exist"
Apr 22 14:43:28.937592 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.937576 2578 scope.go:117] "RemoveContainer" containerID="44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80"
Apr 22 14:43:28.937823 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:43:28.937805 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80\": container with ID starting with 44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80 not found: ID does not exist" containerID="44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80"
Apr 22 14:43:28.937881 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:28.937831 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80"} err="failed to get container status \"44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80\": rpc error: code = NotFound desc = could not find container \"44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80\": container with ID starting with 44ed395e8e9a73767c83868de15a51d23d425150125b0ebcc365df4eb1a37b80 not found: ID does not exist"
Apr 22 14:43:29.028496 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028466 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-tmp\") pod \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") "
Apr 22 14:43:29.028667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028509 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kserve-provision-location\") pod \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") "
Apr 22 14:43:29.028667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028532 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-cache\") pod \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") "
Apr 22 14:43:29.028667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028553 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-uds\") pod \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") "
Apr 22 14:43:29.028667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028595 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tls-certs\") pod \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") "
Apr 22 14:43:29.028667 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028640 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cpq7f\" (UniqueName: \"kubernetes.io/projected/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kube-api-access-cpq7f\") pod \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\" (UID: \"23f9f75d-9858-4ac9-b12c-b497d85a48ca\") "
Apr 22 14:43:29.028939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028781 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "23f9f75d-9858-4ac9-b12c-b497d85a48ca" (UID: "23f9f75d-9858-4ac9-b12c-b497d85a48ca"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:29.028939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028854 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "23f9f75d-9858-4ac9-b12c-b497d85a48ca" (UID: "23f9f75d-9858-4ac9-b12c-b497d85a48ca"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:29.028939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.028892 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "23f9f75d-9858-4ac9-b12c-b497d85a48ca" (UID: "23f9f75d-9858-4ac9-b12c-b497d85a48ca"). InnerVolumeSpecName "tokenizer-tmp".
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:29.029720 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.029637 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:29.029720 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.029669 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:29.029720 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.029692 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:29.029959 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.029764 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "23f9f75d-9858-4ac9-b12c-b497d85a48ca" (UID: "23f9f75d-9858-4ac9-b12c-b497d85a48ca"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:29.036683 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.035931 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kube-api-access-cpq7f" (OuterVolumeSpecName: "kube-api-access-cpq7f") pod "23f9f75d-9858-4ac9-b12c-b497d85a48ca" (UID: "23f9f75d-9858-4ac9-b12c-b497d85a48ca"). InnerVolumeSpecName "kube-api-access-cpq7f". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:43:29.036683 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.035930 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "23f9f75d-9858-4ac9-b12c-b497d85a48ca" (UID: "23f9f75d-9858-4ac9-b12c-b497d85a48ca"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:43:29.130249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.130211 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/23f9f75d-9858-4ac9-b12c-b497d85a48ca-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:29.130249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.130240 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cpq7f\" (UniqueName: \"kubernetes.io/projected/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kube-api-access-cpq7f\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:29.130249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.130250 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/23f9f75d-9858-4ac9-b12c-b497d85a48ca-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:29.237582 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.237547 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg"]
Apr 22 14:43:29.241252 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.241220 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc4e643bc258191ffc517a31cd1d0ddd27-kserve-router-schedkxhg"]
Apr 22 14:43:29.602152 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.602061 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 14:43:29.602444 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.602416 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" podUID="8ed11f5d-245b-4836-81b0-63f2109c1997" containerName="main" containerID="cri-o://972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6" gracePeriod=30
Apr 22 14:43:29.952099 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:29.952069 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" path="/var/lib/kubelet/pods/23f9f75d-9858-4ac9-b12c-b497d85a48ca/volumes"
Apr 22 14:43:30.357790 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.357765 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 14:43:30.442958 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.442926 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed11f5d-245b-4836-81b0-63f2109c1997-tls-certs\") pod \"8ed11f5d-245b-4836-81b0-63f2109c1997\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") "
Apr 22 14:43:30.442958 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.442962 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-kserve-provision-location\") pod \"8ed11f5d-245b-4836-81b0-63f2109c1997\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") "
Apr 22 14:43:30.443215 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.443016 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName:
\"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-model-cache\") pod \"8ed11f5d-245b-4836-81b0-63f2109c1997\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") "
Apr 22 14:43:30.443215 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.443035 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-home\") pod \"8ed11f5d-245b-4836-81b0-63f2109c1997\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") "
Apr 22 14:43:30.443215 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.443052 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-zcd7h\" (UniqueName: \"kubernetes.io/projected/8ed11f5d-245b-4836-81b0-63f2109c1997-kube-api-access-zcd7h\") pod \"8ed11f5d-245b-4836-81b0-63f2109c1997\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") "
Apr 22 14:43:30.443215 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.443076 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-dshm\") pod \"8ed11f5d-245b-4836-81b0-63f2109c1997\" (UID: \"8ed11f5d-245b-4836-81b0-63f2109c1997\") "
Apr 22 14:43:30.443446 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.443368 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-model-cache" (OuterVolumeSpecName: "model-cache") pod "8ed11f5d-245b-4836-81b0-63f2109c1997" (UID: "8ed11f5d-245b-4836-81b0-63f2109c1997"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:30.443553 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.443472 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-home" (OuterVolumeSpecName: "home") pod "8ed11f5d-245b-4836-81b0-63f2109c1997" (UID: "8ed11f5d-245b-4836-81b0-63f2109c1997"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:30.445289 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.445265 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/8ed11f5d-245b-4836-81b0-63f2109c1997-kube-api-access-zcd7h" (OuterVolumeSpecName: "kube-api-access-zcd7h") pod "8ed11f5d-245b-4836-81b0-63f2109c1997" (UID: "8ed11f5d-245b-4836-81b0-63f2109c1997"). InnerVolumeSpecName "kube-api-access-zcd7h". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:43:30.445289 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.445277 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/8ed11f5d-245b-4836-81b0-63f2109c1997-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "8ed11f5d-245b-4836-81b0-63f2109c1997" (UID: "8ed11f5d-245b-4836-81b0-63f2109c1997"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:43:30.445461 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.445269 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-dshm" (OuterVolumeSpecName: "dshm") pod "8ed11f5d-245b-4836-81b0-63f2109c1997" (UID: "8ed11f5d-245b-4836-81b0-63f2109c1997"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:30.503210 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.503105 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "8ed11f5d-245b-4836-81b0-63f2109c1997" (UID: "8ed11f5d-245b-4836-81b0-63f2109c1997"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:43:30.544619 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.544583 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/8ed11f5d-245b-4836-81b0-63f2109c1997-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:30.544619 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.544615 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:30.544798 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.544631 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-model-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:30.544798 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.544643 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-home\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:30.544798 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.544654 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-zcd7h\" (UniqueName:
\"kubernetes.io/projected/8ed11f5d-245b-4836-81b0-63f2109c1997-kube-api-access-zcd7h\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:30.544798 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.544665 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/8ed11f5d-245b-4836-81b0-63f2109c1997-dshm\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:43:30.923044 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.923011 2578 generic.go:358] "Generic (PLEG): container finished" podID="8ed11f5d-245b-4836-81b0-63f2109c1997" containerID="972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6" exitCode=0
Apr 22 14:43:30.923224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.923077 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"
Apr 22 14:43:30.923224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.923098 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ed11f5d-245b-4836-81b0-63f2109c1997","Type":"ContainerDied","Data":"972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6"}
Apr 22 14:43:30.923224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.923133 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1" event={"ID":"8ed11f5d-245b-4836-81b0-63f2109c1997","Type":"ContainerDied","Data":"ba66de5da075db52db0ebae1c52271a82972e914f452b86b70099b3fae3837ca"}
Apr 22 14:43:30.923224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.923148 2578 scope.go:117] "RemoveContainer" containerID="972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6"
Apr 22 14:43:30.944055 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.944010 2578 scope.go:117] "RemoveContainer" containerID="e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1"
Apr 22 14:43:30.947473 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.947448 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 14:43:30.951864 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:30.951838 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0-1"]
Apr 22 14:43:31.006520 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:31.006495 2578 scope.go:117] "RemoveContainer" containerID="972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6"
Apr 22 14:43:31.006882 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:43:31.006831 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6\": container with ID starting with 972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6 not found: ID does not exist" containerID="972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6"
Apr 22 14:43:31.006882 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:31.006860 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6"} err="failed to get container status \"972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6\": rpc error: code = NotFound desc = could not find container \"972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6\": container with ID starting with 972f3fe4e33bd0b62ada33bfa5b45e09e4a005101c7c763f46acb50d4e2ac2f6 not found: ID does not exist"
Apr 22 14:43:31.006882 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:31.006882 2578 scope.go:117] "RemoveContainer" containerID="e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1"
Apr 22 14:43:31.007178 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:43:31.007158 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1\": container with ID starting with e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1 not found: ID does not exist" containerID="e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1"
Apr 22 14:43:31.007224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:31.007185 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1"} err="failed to get container status \"e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1\": rpc error: code = NotFound desc = could not find container \"e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1\": container with ID starting with e69d190e10aa1ded5e7d1b34540680ef13489e1f5a635b0dc240f0f39fddcad1 not found: ID does not exist"
Apr 22 14:43:31.952214 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:31.952178 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="8ed11f5d-245b-4836-81b0-63f2109c1997" path="/var/lib/kubelet/pods/8ed11f5d-245b-4836-81b0-63f2109c1997/volumes"
Apr 22 14:43:38.453729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.453696 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"]
Apr 22 14:43:38.454079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454028 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="tokenizer"
Apr 22 14:43:38.454079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454038 2578 state_mem.go:107]
"Deleted CPUSet assignment" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="tokenizer"
Apr 22 14:43:38.454079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454050 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ed11f5d-245b-4836-81b0-63f2109c1997" containerName="storage-initializer"
Apr 22 14:43:38.454079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454056 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed11f5d-245b-4836-81b0-63f2109c1997" containerName="storage-initializer"
Apr 22 14:43:38.454079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454063 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="main"
Apr 22 14:43:38.454079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454071 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="main"
Apr 22 14:43:38.454079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454083 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="storage-initializer"
Apr 22 14:43:38.454316 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454088 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="storage-initializer"
Apr 22 14:43:38.454316 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454095 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="8ed11f5d-245b-4836-81b0-63f2109c1997" containerName="main"
Apr 22 14:43:38.454316 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454099 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="8ed11f5d-245b-4836-81b0-63f2109c1997" containerName="main"
Apr 22 14:43:38.454316 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454160 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="main"
Apr 22 14:43:38.454316 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454169 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="23f9f75d-9858-4ac9-b12c-b497d85a48ca" containerName="tokenizer"
Apr 22 14:43:38.454316 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.454176 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="8ed11f5d-245b-4836-81b0-63f2109c1997" containerName="main"
Apr 22 14:43:38.457471 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.457448 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.460970 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.460949 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\""
Apr 22 14:43:38.464831 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.464806 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"]
Apr 22 14:43:38.612064 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.612024 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.612266 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.612073 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-model-cache\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.612266 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.612145 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-thhjs\" (UniqueName: \"kubernetes.io/projected/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kube-api-access-thhjs\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.612266 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.612207 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-dshm\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.612441 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.612271 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-home\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.612441 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.612355 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-tls-certs\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID:
\"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.713617 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.713520 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.713617 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.713569 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-model-cache\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.713829 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.713640 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-thhjs\" (UniqueName: \"kubernetes.io/projected/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kube-api-access-thhjs\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.713829 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.713691 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-dshm\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.713829 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.713756 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-home\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.713829 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.713796 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-tls-certs\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.713984 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.713971 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-model-cache\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.714144 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.714117 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-home\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.714337 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.714285 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.716065 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.716043 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-dshm\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.716224 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.716207 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-tls-certs\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.722962 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.722931 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-thhjs\" (UniqueName: \"kubernetes.io/projected/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kube-api-access-thhjs\") pod \"scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:43:38.770111 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.770075 2578 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" Apr 22 14:43:38.892068 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.891909 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"] Apr 22 14:43:38.894450 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:43:38.894407 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9a0950f9_5dd9_4e0a_96bf_1523cc3b0fdc.slice/crio-01bbf4c537ab4827c0b108b4766fbdefb6a0a107ef30d9b54183a6f38803eba1 WatchSource:0}: Error finding container 01bbf4c537ab4827c0b108b4766fbdefb6a0a107ef30d9b54183a6f38803eba1: Status 404 returned error can't find the container with id 01bbf4c537ab4827c0b108b4766fbdefb6a0a107ef30d9b54183a6f38803eba1 Apr 22 14:43:38.952317 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:38.952268 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" event={"ID":"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc","Type":"ContainerStarted","Data":"01bbf4c537ab4827c0b108b4766fbdefb6a0a107ef30d9b54183a6f38803eba1"} Apr 22 14:43:39.956165 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:39.956127 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" event={"ID":"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc","Type":"ContainerStarted","Data":"e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404"} Apr 22 14:43:43.976894 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:43.976857 2578 generic.go:358] "Generic (PLEG): container finished" podID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" containerID="e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404" exitCode=0 Apr 22 14:43:43.977294 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:43.976933 2578 kubelet.go:2569] "SyncLoop 
(PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" event={"ID":"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc","Type":"ContainerDied","Data":"e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404"} Apr 22 14:43:44.308692 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:44.308453 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr"] Apr 22 14:43:44.308920 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:44.308861 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="main" containerID="cri-o://b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770" gracePeriod=30 Apr 22 14:43:44.309196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:44.309129 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="tokenizer" containerID="cri-o://ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5" gracePeriod=30 Apr 22 14:43:44.342704 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:43:44.342670 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f11ebe1_fff6_4c8d_aef9_19b64b34e99e.slice/crio-conmon-b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770.scope\": RecentStats: unable to find data in memory cache], [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6f11ebe1_fff6_4c8d_aef9_19b64b34e99e.slice/crio-b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770.scope\": RecentStats: unable to find data in memory cache]" Apr 22 
14:43:44.981654 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:44.981609 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" event={"ID":"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc","Type":"ContainerStarted","Data":"06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6"} Apr 22 14:43:44.983497 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:44.983469 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerID="b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770" exitCode=0 Apr 22 14:43:44.983603 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:44.983516 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" event={"ID":"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e","Type":"ContainerDied","Data":"b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770"} Apr 22 14:43:45.001703 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.001639 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" podStartSLOduration=7.001617995 podStartE2EDuration="7.001617995s" podCreationTimestamp="2026-04-22 14:43:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:43:44.999674501 +0000 UTC m=+1725.674477765" watchObservedRunningTime="2026-04-22 14:43:45.001617995 +0000 UTC m=+1725.676421274" Apr 22 14:43:45.656546 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.656521 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:43:45.670544 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.670516 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-uds\") pod \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " Apr 22 14:43:45.670686 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.670562 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-xm9sh\" (UniqueName: \"kubernetes.io/projected/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kube-api-access-xm9sh\") pod \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " Apr 22 14:43:45.670686 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.670628 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tls-certs\") pod \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " Apr 22 14:43:45.670686 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.670651 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-tmp\") pod \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " Apr 22 14:43:45.670849 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.670686 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kserve-provision-location\") pod \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " 
Apr 22 14:43:45.670849 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.670726 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-cache\") pod \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\" (UID: \"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e\") " Apr 22 14:43:45.670849 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.670764 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" (UID: "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:43:45.670989 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.670955 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:43:45.671054 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.671033 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" (UID: "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:43:45.671116 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.671050 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" (UID: "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e"). 
InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:43:45.671362 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.671340 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" (UID: "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:43:45.672812 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.672754 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kube-api-access-xm9sh" (OuterVolumeSpecName: "kube-api-access-xm9sh") pod "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" (UID: "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e"). InnerVolumeSpecName "kube-api-access-xm9sh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:43:45.672812 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.672775 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" (UID: "6f11ebe1-fff6-4c8d-aef9-19b64b34e99e"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:43:45.771843 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.771805 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:43:45.771843 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.771836 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:43:45.771843 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.771845 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:43:45.772079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.771860 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:43:45.772079 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.771873 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-xm9sh\" (UniqueName: \"kubernetes.io/projected/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e-kube-api-access-xm9sh\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:43:45.988582 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.988545 2578 generic.go:358] "Generic (PLEG): container finished" podID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerID="ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5" exitCode=0 Apr 22 14:43:45.988960 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.988626 2578 
kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" event={"ID":"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e","Type":"ContainerDied","Data":"ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5"} Apr 22 14:43:45.988960 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.988635 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" Apr 22 14:43:45.988960 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.988673 2578 scope.go:117] "RemoveContainer" containerID="ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5" Apr 22 14:43:45.988960 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.988662 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr" event={"ID":"6f11ebe1-fff6-4c8d-aef9-19b64b34e99e","Type":"ContainerDied","Data":"41dc2f21711f6fb2b3c54edd38c5a1bf2c1e1fcdbf2175b2e5fcac7e40b37588"} Apr 22 14:43:45.996992 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:45.996972 2578 scope.go:117] "RemoveContainer" containerID="b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770" Apr 22 14:43:46.004434 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.004416 2578 scope.go:117] "RemoveContainer" containerID="24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c" Apr 22 14:43:46.008233 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.008211 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr"] Apr 22 14:43:46.012131 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.012113 2578 scope.go:117] "RemoveContainer" containerID="ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5" Apr 22 14:43:46.012431 ip-10-0-132-130 kubenswrapper[2578]: 
E0422 14:43:46.012409 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5\": container with ID starting with ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5 not found: ID does not exist" containerID="ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5" Apr 22 14:43:46.012505 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.012438 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5"} err="failed to get container status \"ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5\": rpc error: code = NotFound desc = could not find container \"ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5\": container with ID starting with ebd47f4e37961255bd0371dfc24d4dcb6742d0fbee7b43b27a0b0c1e4437b0a5 not found: ID does not exist" Apr 22 14:43:46.012505 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.012457 2578 scope.go:117] "RemoveContainer" containerID="b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770" Apr 22 14:43:46.012726 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:43:46.012704 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770\": container with ID starting with b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770 not found: ID does not exist" containerID="b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770" Apr 22 14:43:46.012799 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.012738 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770"} err="failed to get container 
status \"b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770\": rpc error: code = NotFound desc = could not find container \"b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770\": container with ID starting with b9c597c139e1936cff1f2729044d1dad3f7bcbce58208d23fcac6ba5e44d5770 not found: ID does not exist" Apr 22 14:43:46.012799 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.012764 2578 scope.go:117] "RemoveContainer" containerID="24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c" Apr 22 14:43:46.013023 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:43:46.012999 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c\": container with ID starting with 24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c not found: ID does not exist" containerID="24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c" Apr 22 14:43:46.013125 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.013034 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c"} err="failed to get container status \"24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c\": rpc error: code = NotFound desc = could not find container \"24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c\": container with ID starting with 24969524af9f308c8bd25f988ff3686987bcfe98b05a2a5abcc488d989738f7c not found: ID does not exist" Apr 22 14:43:46.013687 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:46.013668 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-router-scheduler-6cbc49t9dr"] Apr 22 14:43:47.952245 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:47.952187 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes 
dir" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" path="/var/lib/kubelet/pods/6f11ebe1-fff6-4c8d-aef9-19b64b34e99e/volumes" Apr 22 14:43:48.770685 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:48.770642 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" Apr 22 14:43:48.770996 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:48.770698 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" Apr 22 14:43:48.782825 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:48.782793 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" Apr 22 14:43:49.017883 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:43:49.017854 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" Apr 22 14:44:08.301895 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.301862 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"] Apr 22 14:44:08.302268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.302197 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="main" Apr 22 14:44:08.302268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.302208 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="main" Apr 22 14:44:08.302268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.302226 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="storage-initializer" Apr 22 14:44:08.302268 ip-10-0-132-130 
kubenswrapper[2578]: I0422 14:44:08.302232 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="storage-initializer" Apr 22 14:44:08.302268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.302244 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="tokenizer" Apr 22 14:44:08.302268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.302250 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="tokenizer" Apr 22 14:44:08.302492 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.302331 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="tokenizer" Apr 22 14:44:08.302492 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.302341 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="6f11ebe1-fff6-4c8d-aef9-19b64b34e99e" containerName="main" Apr 22 14:44:08.307647 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.307627 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.310444 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.310425 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-self-signed-certs\"" Apr 22 14:44:08.310521 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.310500 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-kserve-dockercfg-czzsv\"" Apr 22 14:44:08.317653 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.317627 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"] Apr 22 14:44:08.325240 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.325214 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"] Apr 22 14:44:08.328736 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.328719 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.340188 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.340162 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"] Apr 22 14:44:08.356794 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.356765 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.356939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.356808 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acc96f71-652c-466a-b7dc-12ddf07b951a-tls-certs\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.356939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.356854 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-home\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.356939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.356900 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-dshm\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.356939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.356917 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-l99gm\" (UniqueName: \"kubernetes.io/projected/acc96f71-652c-466a-b7dc-12ddf07b951a-kube-api-access-l99gm\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.357078 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.357013 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-model-cache\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.457750 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.457713 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acc96f71-652c-466a-b7dc-12ddf07b951a-tls-certs\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.457750 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.457755 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-home\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: 
\"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.457988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.457778 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.457988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.457812 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.457988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.457838 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-home\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.457988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.457867 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-dshm\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.457988 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.457977 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/edd21b84-194f-43ee-b019-8a392b6e1029-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.458225 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.458015 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-l99gm\" (UniqueName: \"kubernetes.io/projected/acc96f71-652c-466a-b7dc-12ddf07b951a-kube-api-access-l99gm\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.458225 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.458052 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-72wfr\" (UniqueName: \"kubernetes.io/projected/edd21b84-194f-43ee-b019-8a392b6e1029-kube-api-access-72wfr\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.458225 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.458092 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-model-cache\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 
14:44:08.458225 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.458125 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.458225 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.458175 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.458225 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.458185 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-home\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.458487 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.458404 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-model-cache\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.458487 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.458449 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.460169 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.460145 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-dshm\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.460490 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.460470 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acc96f71-652c-466a-b7dc-12ddf07b951a-tls-certs\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.467817 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.467788 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-l99gm\" (UniqueName: \"kubernetes.io/projected/acc96f71-652c-466a-b7dc-12ddf07b951a-kube-api-access-l99gm\") pod \"router-with-refs-pd-test-kserve-64bf496d54-bxxzr\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.558926 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.558828 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: 
\"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.558926 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.558875 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.559164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.558967 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-home\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.559164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.558994 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/edd21b84-194f-43ee-b019-8a392b6e1029-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.559164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.559016 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-72wfr\" (UniqueName: \"kubernetes.io/projected/edd21b84-194f-43ee-b019-8a392b6e1029-kube-api-access-72wfr\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.559164 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.559043 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.559472 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.559437 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-home\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.559584 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.559471 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.559584 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.559488 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-model-cache\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.561267 ip-10-0-132-130 kubenswrapper[2578]: 
I0422 14:44:08.561245 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-dshm\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.561572 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.561552 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/edd21b84-194f-43ee-b019-8a392b6e1029-tls-certs\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.567334 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.567294 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-72wfr\" (UniqueName: \"kubernetes.io/projected/edd21b84-194f-43ee-b019-8a392b6e1029-kube-api-access-72wfr\") pod \"router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.618460 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.618422 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:08.641952 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.641918 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:44:08.770723 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.770679 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"] Apr 22 14:44:08.776138 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:44:08.776092 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc96f71_652c_466a_b7dc_12ddf07b951a.slice/crio-8663d59dbb20614005d5a41a8851b153a2edf3406abe9cc1ffa74aa1074e5eba WatchSource:0}: Error finding container 8663d59dbb20614005d5a41a8851b153a2edf3406abe9cc1ffa74aa1074e5eba: Status 404 returned error can't find the container with id 8663d59dbb20614005d5a41a8851b153a2edf3406abe9cc1ffa74aa1074e5eba Apr 22 14:44:08.777575 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.777548 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"] Apr 22 14:44:08.777719 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.777583 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"] Apr 22 14:44:08.777843 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.777719 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.781840 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.781815 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-pd-test-epp-sa-dockercfg-gdfxk\"" Apr 22 14:44:08.784710 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.784685 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"] Apr 22 14:44:08.786270 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:44:08.786246 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podedd21b84_194f_43ee_b019_8a392b6e1029.slice/crio-d935d018269c1317d139534a66b1c722bf703d4802d017dbabad7bb93f4e4f01 WatchSource:0}: Error finding container d935d018269c1317d139534a66b1c722bf703d4802d017dbabad7bb93f4e4f01: Status 404 returned error can't find the container with id d935d018269c1317d139534a66b1c722bf703d4802d017dbabad7bb93f4e4f01 Apr 22 14:44:08.861992 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.861948 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.862144 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.862012 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-uds\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" 
(UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.862144 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.862046 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.862144 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.862068 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kgp8b\" (UniqueName: \"kubernetes.io/projected/9fd74841-a4b1-4c4d-ac58-01005c578a43-kube-api-access-kgp8b\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.862268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.862141 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.862268 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.862177 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd74841-a4b1-4c4d-ac58-01005c578a43-tls-certs\") pod 
\"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963105 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963067 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963290 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963113 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd74841-a4b1-4c4d-ac58-01005c578a43-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963290 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963166 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963290 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963195 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-uds\") pod 
\"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963290 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963235 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963290 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963259 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kgp8b\" (UniqueName: \"kubernetes.io/projected/9fd74841-a4b1-4c4d-ac58-01005c578a43-kube-api-access-kgp8b\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963594 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963570 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-kserve-provision-location\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963654 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963617 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-uds\") pod 
\"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963712 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963661 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-tmp\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.963762 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.963700 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-cache\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.965659 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.965643 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd74841-a4b1-4c4d-ac58-01005c578a43-tls-certs\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:08.971270 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:08.971253 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kgp8b\" (UniqueName: \"kubernetes.io/projected/9fd74841-a4b1-4c4d-ac58-01005c578a43-kube-api-access-kgp8b\") pod \"router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k\" (UID: 
\"9fd74841-a4b1-4c4d-ac58-01005c578a43\") " pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:09.077353 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:09.077215 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" event={"ID":"edd21b84-194f-43ee-b019-8a392b6e1029","Type":"ContainerStarted","Data":"7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050"} Apr 22 14:44:09.077353 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:09.077263 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" event={"ID":"edd21b84-194f-43ee-b019-8a392b6e1029","Type":"ContainerStarted","Data":"d935d018269c1317d139534a66b1c722bf703d4802d017dbabad7bb93f4e4f01"} Apr 22 14:44:09.078458 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:09.078426 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" event={"ID":"acc96f71-652c-466a-b7dc-12ddf07b951a","Type":"ContainerStarted","Data":"8663d59dbb20614005d5a41a8851b153a2edf3406abe9cc1ffa74aa1074e5eba"} Apr 22 14:44:09.091636 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:09.091608 2578 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:09.224140 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:09.224113 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"] Apr 22 14:44:09.226127 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:44:09.226090 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9fd74841_a4b1_4c4d_ac58_01005c578a43.slice/crio-d83e4d1c396c5e01fc71ff6931051bf8c3f9dc375db62adb3278a6ca7fa96525 WatchSource:0}: Error finding container d83e4d1c396c5e01fc71ff6931051bf8c3f9dc375db62adb3278a6ca7fa96525: Status 404 returned error can't find the container with id d83e4d1c396c5e01fc71ff6931051bf8c3f9dc375db62adb3278a6ca7fa96525 Apr 22 14:44:10.084202 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:10.084152 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" event={"ID":"acc96f71-652c-466a-b7dc-12ddf07b951a","Type":"ContainerStarted","Data":"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e"} Apr 22 14:44:10.084663 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:10.084275 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:44:10.085930 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:10.085849 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" event={"ID":"9fd74841-a4b1-4c4d-ac58-01005c578a43","Type":"ContainerStarted","Data":"32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376"} Apr 22 14:44:10.086058 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:10.085995 2578 kubelet.go:2569] "SyncLoop (PLEG): 
event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" event={"ID":"9fd74841-a4b1-4c4d-ac58-01005c578a43","Type":"ContainerStarted","Data":"d83e4d1c396c5e01fc71ff6931051bf8c3f9dc375db62adb3278a6ca7fa96525"} Apr 22 14:44:11.091717 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:11.091601 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" event={"ID":"acc96f71-652c-466a-b7dc-12ddf07b951a","Type":"ContainerStarted","Data":"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e"} Apr 22 14:44:11.093899 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:11.093873 2578 generic.go:358] "Generic (PLEG): container finished" podID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerID="32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376" exitCode=0 Apr 22 14:44:11.094030 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:11.093962 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" event={"ID":"9fd74841-a4b1-4c4d-ac58-01005c578a43","Type":"ContainerDied","Data":"32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376"} Apr 22 14:44:12.098968 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:12.098929 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" event={"ID":"9fd74841-a4b1-4c4d-ac58-01005c578a43","Type":"ContainerStarted","Data":"daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156"} Apr 22 14:44:12.099378 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:12.098978 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" 
event={"ID":"9fd74841-a4b1-4c4d-ac58-01005c578a43","Type":"ContainerStarted","Data":"ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154"} Apr 22 14:44:12.099378 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:12.099287 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:44:12.119647 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:12.119587 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" podStartSLOduration=4.11957045 podStartE2EDuration="4.11957045s" podCreationTimestamp="2026-04-22 14:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:44:12.11757279 +0000 UTC m=+1752.792376054" watchObservedRunningTime="2026-04-22 14:44:12.11957045 +0000 UTC m=+1752.794373715" Apr 22 14:44:14.119022 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:14.118982 2578 generic.go:358] "Generic (PLEG): container finished" podID="edd21b84-194f-43ee-b019-8a392b6e1029" containerID="7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050" exitCode=0 Apr 22 14:44:14.119444 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:14.119048 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" event={"ID":"edd21b84-194f-43ee-b019-8a392b6e1029","Type":"ContainerDied","Data":"7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050"} Apr 22 14:44:14.442502 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:44:14.442467 2578 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: 
[\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podacc96f71_652c_466a_b7dc_12ddf07b951a.slice/crio-conmon-6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e.scope\": RecentStats: unable to find data in memory cache]"
Apr 22 14:44:15.125996 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:15.125951 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" event={"ID":"edd21b84-194f-43ee-b019-8a392b6e1029","Type":"ContainerStarted","Data":"c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565"}
Apr 22 14:44:15.127661 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:15.127629 2578 generic.go:358] "Generic (PLEG): container finished" podID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerID="6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e" exitCode=0
Apr 22 14:44:15.127893 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:15.127669 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" event={"ID":"acc96f71-652c-466a-b7dc-12ddf07b951a","Type":"ContainerDied","Data":"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e"}
Apr 22 14:44:15.150517 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:15.150457 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podStartSLOduration=7.150437745 podStartE2EDuration="7.150437745s" podCreationTimestamp="2026-04-22 14:44:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:44:15.148080174 +0000 UTC m=+1755.822883432" watchObservedRunningTime="2026-04-22 14:44:15.150437745 +0000 UTC m=+1755.825241011"
Apr 22 14:44:16.134147 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:16.134107 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" event={"ID":"acc96f71-652c-466a-b7dc-12ddf07b951a","Type":"ContainerStarted","Data":"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c"}
Apr 22 14:44:16.160837 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:16.160746 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podStartSLOduration=7.102892151 podStartE2EDuration="8.160723833s" podCreationTimestamp="2026-04-22 14:44:08 +0000 UTC" firstStartedPulling="2026-04-22 14:44:08.778513838 +0000 UTC m=+1749.453317082" lastFinishedPulling="2026-04-22 14:44:09.836345518 +0000 UTC m=+1750.511148764" observedRunningTime="2026-04-22 14:44:16.158034928 +0000 UTC m=+1756.832838197" watchObservedRunningTime="2026-04-22 14:44:16.160723833 +0000 UTC m=+1756.835527100"
Apr 22 14:44:18.619312 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:18.619253 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"
Apr 22 14:44:18.619312 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:18.619322 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"
Apr 22 14:44:18.621036 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:18.620995 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:44:18.642725 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:18.642684 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"
Apr 22 14:44:18.642725 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:18.642731 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"
Apr 22 14:44:18.644532 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:18.644487 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:44:19.092136 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:19.092092 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"
Apr 22 14:44:19.092329 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:19.092152 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"
Apr 22 14:44:19.093795 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:19.093757 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="tokenizer" probeResult="failure" output="Get \"http://10.134.0.49:8082/healthz\": dial tcp 10.134.0.49:8082: connect: connection refused"
Apr 22 14:44:22.104735 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.104695 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"]
Apr 22 14:44:22.105361 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.105000 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" podUID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" containerName="main" containerID="cri-o://06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6" gracePeriod=30
Apr 22 14:44:22.398539 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.398512 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:44:22.515847 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.515799 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-thhjs\" (UniqueName: \"kubernetes.io/projected/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kube-api-access-thhjs\") pod \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") "
Apr 22 14:44:22.516040 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.515938 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-dshm\") pod \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") "
Apr 22 14:44:22.516040 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.515970 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-home\") pod \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") "
Apr 22 14:44:22.516040 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.515999 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-model-cache\") pod \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") "
Apr 22 14:44:22.516040 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.516024 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-tls-certs\") pod \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") "
Apr 22 14:44:22.516269 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.516064 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kserve-provision-location\") pod \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\" (UID: \"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc\") "
Apr 22 14:44:22.516269 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.516214 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-home" (OuterVolumeSpecName: "home") pod "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" (UID: "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:44:22.516424 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.516389 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-home\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:44:22.516574 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.516551 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-model-cache" (OuterVolumeSpecName: "model-cache") pod "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" (UID: "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:44:22.518367 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.518333 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kube-api-access-thhjs" (OuterVolumeSpecName: "kube-api-access-thhjs") pod "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" (UID: "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc"). InnerVolumeSpecName "kube-api-access-thhjs". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 22 14:44:22.518545 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.518525 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" (UID: "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 22 14:44:22.518651 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.518633 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-dshm" (OuterVolumeSpecName: "dshm") pod "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" (UID: "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:44:22.599352 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.599276 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" (UID: "9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:44:22.617592 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.617555 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-thhjs\" (UniqueName: \"kubernetes.io/projected/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kube-api-access-thhjs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:44:22.617715 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.617595 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-dshm\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:44:22.617715 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.617609 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-model-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:44:22.617715 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.617624 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:44:22.617715 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:22.617637 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\""
Apr 22 14:44:23.165567 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.165522 2578 generic.go:358] "Generic (PLEG): container finished" podID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" containerID="06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6" exitCode=0
Apr 22 14:44:23.166007 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.165580 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" event={"ID":"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc","Type":"ContainerDied","Data":"06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6"}
Apr 22 14:44:23.166007 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.165616 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"
Apr 22 14:44:23.166007 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.165633 2578 scope.go:117] "RemoveContainer" containerID="06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6"
Apr 22 14:44:23.166007 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.165618 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd" event={"ID":"9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc","Type":"ContainerDied","Data":"01bbf4c537ab4827c0b108b4766fbdefb6a0a107ef30d9b54183a6f38803eba1"}
Apr 22 14:44:23.176376 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.176350 2578 scope.go:117] "RemoveContainer" containerID="e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404"
Apr 22 14:44:23.188815 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.188786 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"]
Apr 22 14:44:23.191129 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.191104 2578 scope.go:117] "RemoveContainer" containerID="06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6"
Apr 22 14:44:23.191624 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:44:23.191588 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6\": container with ID starting with 06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6 not found: ID does not exist" containerID="06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6"
Apr 22 14:44:23.191721 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.191637 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6"} err="failed to get container status \"06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6\": rpc error: code = NotFound desc = could not find container \"06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6\": container with ID starting with 06cb0385fd7882aff01c867a7e848e49b606c7acf05cb7514cfd6bf2ce0809b6 not found: ID does not exist"
Apr 22 14:44:23.191721 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.191665 2578 scope.go:117] "RemoveContainer" containerID="e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404"
Apr 22 14:44:23.192128 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:44:23.191981 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404\": container with ID starting with e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404 not found: ID does not exist" containerID="e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404"
Apr 22 14:44:23.192128 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.192015 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404"} err="failed to get container status \"e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404\": rpc error: code = NotFound desc = could not find container \"e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404\": container with ID starting with e0847dcd629e1d0cc246f0d0cf84a868eeb635f5b231d2c6a49aabd053a01404 not found: ID does not exist"
Apr 22 14:44:23.193001 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.192980 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-5dd6f9b7ff-qrzrd"]
Apr 22 14:44:23.953383 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:23.953345 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" path="/var/lib/kubelet/pods/9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc/volumes"
Apr 22 14:44:28.619915 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:28.619821 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:44:28.632157 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:28.632131 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"
Apr 22 14:44:28.642399 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:28.642360 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:44:29.093847 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:29.093814 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"
Apr 22 14:44:29.095190 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:29.095166 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"
Apr 22 14:44:38.619841 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:38.619777 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:44:38.642866 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:38.642818 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:44:48.619670 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:48.619613 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:44:48.642542 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:48.642497 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:44:49.192066 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:49.192031 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"
Apr 22 14:44:58.619578 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:58.619509 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:44:58.642467 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:58.642417 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:44:59.968790 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:59.968760 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:44:59.976240 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:44:59.976215 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:45:08.619866 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:08.619813 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:45:08.642658 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:08.642595 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:45:18.618878 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:18.618827 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:45:18.642654 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:18.642609 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:45:28.619383 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:28.619323 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:45:28.642493 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:28.642446 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:45:38.619695 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:38.619642 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:45:38.642509 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:38.642463 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:45:48.619763 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:48.619702 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:45:48.643245 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:48.643205 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:45:58.619182 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:58.619091 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:45:58.642613 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:45:58.642570 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:46:08.619811 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:08.619761 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:46:08.642450 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:08.642407 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:46:18.619358 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:18.619248 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:46:18.642333 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:18.642272 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:46:28.619236 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:28.619180 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8001/health\": dial tcp 10.134.0.47:8001: connect: connection refused"
Apr 22 14:46:28.642768 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:28.642720 2578 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused"
Apr 22 14:46:38.629080 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:38.629043 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"
Apr 22 14:46:38.641763 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:38.641727 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"
Apr 22 14:46:38.652183 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:38.652153 2578 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"
Apr 22 14:46:38.660801 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:38.660745 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"
Apr 22 14:46:50.139192 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.139155 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"]
Apr 22 14:46:50.140575 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.140511 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" containerID="cri-o://3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c" gracePeriod=30
Apr 22 14:46:50.142105 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.142056 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"]
Apr 22 14:46:50.142506 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.142478 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main" containerID="cri-o://c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565" gracePeriod=30
Apr 22 14:46:50.144117 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.144074 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"]
Apr 22 14:46:50.144508 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.144481 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="main" containerID="cri-o://ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154" gracePeriod=30
Apr 22 14:46:50.144656 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.144638 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="tokenizer" containerID="cri-o://daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156" gracePeriod=30
Apr 22 14:46:50.733576 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.733540 2578 generic.go:358] "Generic (PLEG): container finished" podID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerID="ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154" exitCode=0
Apr 22 14:46:50.733780 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:50.733588 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" event={"ID":"9fd74841-a4b1-4c4d-ac58-01005c578a43","Type":"ContainerDied","Data":"ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154"}
Apr 22 14:46:51.490406 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.490378 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"
Apr 22 14:46:51.612898 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.612803 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-kserve-provision-location\") pod \"9fd74841-a4b1-4c4d-ac58-01005c578a43\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") "
Apr 22 14:46:51.612898 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.612875 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-tmp\") pod \"9fd74841-a4b1-4c4d-ac58-01005c578a43\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") "
Apr 22 14:46:51.613150 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.612913 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kgp8b\" (UniqueName: \"kubernetes.io/projected/9fd74841-a4b1-4c4d-ac58-01005c578a43-kube-api-access-kgp8b\") pod \"9fd74841-a4b1-4c4d-ac58-01005c578a43\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") "
Apr 22 14:46:51.613150 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.612935 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-cache\") pod \"9fd74841-a4b1-4c4d-ac58-01005c578a43\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") "
Apr 22 14:46:51.613150 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.612953 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd74841-a4b1-4c4d-ac58-01005c578a43-tls-certs\") pod \"9fd74841-a4b1-4c4d-ac58-01005c578a43\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") "
Apr 22 14:46:51.613150 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.612983 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-uds\") pod \"9fd74841-a4b1-4c4d-ac58-01005c578a43\" (UID: \"9fd74841-a4b1-4c4d-ac58-01005c578a43\") "
Apr 22 14:46:51.613389 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.613266 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "9fd74841-a4b1-4c4d-ac58-01005c578a43" (UID: "9fd74841-a4b1-4c4d-ac58-01005c578a43"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:46:51.613389 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.613250 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "9fd74841-a4b1-4c4d-ac58-01005c578a43" (UID: "9fd74841-a4b1-4c4d-ac58-01005c578a43"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:46:51.613389 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.613341 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "9fd74841-a4b1-4c4d-ac58-01005c578a43" (UID: "9fd74841-a4b1-4c4d-ac58-01005c578a43"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:46:51.613585 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.613559 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9fd74841-a4b1-4c4d-ac58-01005c578a43" (UID: "9fd74841-a4b1-4c4d-ac58-01005c578a43"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 22 14:46:51.615138 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.615116 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9fd74841-a4b1-4c4d-ac58-01005c578a43-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9fd74841-a4b1-4c4d-ac58-01005c578a43" (UID: "9fd74841-a4b1-4c4d-ac58-01005c578a43"). InnerVolumeSpecName "tls-certs".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:46:51.615226 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.615141 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9fd74841-a4b1-4c4d-ac58-01005c578a43-kube-api-access-kgp8b" (OuterVolumeSpecName: "kube-api-access-kgp8b") pod "9fd74841-a4b1-4c4d-ac58-01005c578a43" (UID: "9fd74841-a4b1-4c4d-ac58-01005c578a43"). InnerVolumeSpecName "kube-api-access-kgp8b". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:46:51.713775 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.713737 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-tmp\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:46:51.713775 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.713769 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kgp8b\" (UniqueName: \"kubernetes.io/projected/9fd74841-a4b1-4c4d-ac58-01005c578a43-kube-api-access-kgp8b\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:46:51.713775 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.713780 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:46:51.714016 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.713790 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9fd74841-a4b1-4c4d-ac58-01005c578a43-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:46:51.714016 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.713799 2578 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: 
\"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-tokenizer-uds\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:46:51.714016 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.713808 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9fd74841-a4b1-4c4d-ac58-01005c578a43-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:46:51.739126 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.739090 2578 generic.go:358] "Generic (PLEG): container finished" podID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerID="daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156" exitCode=0 Apr 22 14:46:51.739263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.739156 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" event={"ID":"9fd74841-a4b1-4c4d-ac58-01005c578a43","Type":"ContainerDied","Data":"daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156"} Apr 22 14:46:51.739263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.739162 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" Apr 22 14:46:51.739263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.739182 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k" event={"ID":"9fd74841-a4b1-4c4d-ac58-01005c578a43","Type":"ContainerDied","Data":"d83e4d1c396c5e01fc71ff6931051bf8c3f9dc375db62adb3278a6ca7fa96525"} Apr 22 14:46:51.739263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.739197 2578 scope.go:117] "RemoveContainer" containerID="daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156" Apr 22 14:46:51.748214 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.748194 2578 scope.go:117] "RemoveContainer" containerID="ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154" Apr 22 14:46:51.755808 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.755782 2578 scope.go:117] "RemoveContainer" containerID="32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376" Apr 22 14:46:51.763828 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.763805 2578 scope.go:117] "RemoveContainer" containerID="daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156" Apr 22 14:46:51.764193 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:46:51.764159 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156\": container with ID starting with daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156 not found: ID does not exist" containerID="daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156" Apr 22 14:46:51.764327 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.764202 2578 pod_container_deletor.go:53] "DeleteContainer returned error" 
containerID={"Type":"cri-o","ID":"daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156"} err="failed to get container status \"daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156\": rpc error: code = NotFound desc = could not find container \"daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156\": container with ID starting with daf9abcdfb64eb1fafc2103ffd06b365d16d2bef4f459fc6d7279f1767110156 not found: ID does not exist" Apr 22 14:46:51.764327 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.764228 2578 scope.go:117] "RemoveContainer" containerID="ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154" Apr 22 14:46:51.764567 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:46:51.764531 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154\": container with ID starting with ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154 not found: ID does not exist" containerID="ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154" Apr 22 14:46:51.764639 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.764563 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154"} err="failed to get container status \"ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154\": rpc error: code = NotFound desc = could not find container \"ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154\": container with ID starting with ae34970d8f3621399effa032017986c5e7ba0e440c760a1d47d88c637632a154 not found: ID does not exist" Apr 22 14:46:51.764639 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.764582 2578 scope.go:117] "RemoveContainer" containerID="32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376" Apr 22 14:46:51.765681 ip-10-0-132-130 
kubenswrapper[2578]: E0422 14:46:51.765192 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376\": container with ID starting with 32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376 not found: ID does not exist" containerID="32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376" Apr 22 14:46:51.765681 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.765279 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376"} err="failed to get container status \"32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376\": rpc error: code = NotFound desc = could not find container \"32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376\": container with ID starting with 32d37ad7a03990ea77baaa835f183e7c66af5a052a63616409de3dc59608e376 not found: ID does not exist" Apr 22 14:46:51.768001 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.767079 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"] Apr 22 14:46:51.771440 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.771419 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-router-scheduler-6ccc9d84d6rr8k"] Apr 22 14:46:51.951767 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:46:51.951726 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" path="/var/lib/kubelet/pods/9fd74841-a4b1-4c4d-ac58-01005c578a43/volumes" Apr 22 14:47:07.159242 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:07.159202 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:07.182366 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:07.182338 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:07.198553 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:07.198523 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:07.219542 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:07.219507 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:07.227428 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:07.227397 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:08.261430 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:08.261394 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:08.270923 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:08.270896 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:08.283076 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:08.283044 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:08.307367 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:08.307290 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:08.316662 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:08.316639 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:09.397743 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:09.397710 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:09.411545 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:09.411512 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:09.424855 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:09.424822 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:09.447490 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:09.447449 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:09.455910 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:09.455885 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:10.480605 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:10.480573 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:10.491817 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:10.491786 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:10.503933 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:10.503890 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:10.524694 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:10.524664 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:10.532115 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:10.532080 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:11.532673 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:11.532637 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:11.539705 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:11.539674 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:11.551168 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:11.551137 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:11.573534 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:11.573499 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:11.582573 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:11.582519 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:12.585773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:12.585739 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:12.592926 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:12.592900 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:12.603959 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:12.603919 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:12.624150 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:12.624115 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:12.631755 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:12.631722 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:13.631232 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:13.631198 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:13.638729 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:13.638692 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:13.653306 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:13.653277 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:13.712048 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:13.712018 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:13.721498 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:13.721463 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:14.720185 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:14.720152 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:14.731591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:14.731567 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:14.748641 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:14.748608 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:14.773235 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:14.773204 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:14.803308 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:14.803263 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:15.814102 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:15.814027 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:15.821259 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:15.821219 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:15.836540 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:15.836512 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:15.862504 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:15.862461 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:15.873131 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:15.873094 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:16.870128 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:16.870098 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:16.879556 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:16.879531 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:16.890497 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:16.890466 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:16.909655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:16.909625 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:16.921254 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:16.921228 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:17.955882 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:17.955851 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:17.964229 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:17.964195 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:17.976863 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:17.976831 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:17.998665 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:17.998630 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:18.010678 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:18.010626 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:19.048834 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:19.048806 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:19.059205 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:19.059171 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:19.070814 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:19.070786 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:19.093634 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:19.093602 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:19.103412 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:19.103381 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:20.077715 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.077680 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:20.084867 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.084838 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/llm-d-routing-sidecar/0.log" Apr 22 14:47:20.095088 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.095048 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/storage-initializer/0.log" Apr 22 14:47:20.117785 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.117760 2578 log.go:25] "Finished parsing log file" 
path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/main/0.log" Apr 22 14:47:20.124986 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.124956 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl_edd21b84-194f-43ee-b019-8a392b6e1029/storage-initializer/0.log" Apr 22 14:47:20.140932 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.140887 2578 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="llm-d-routing-sidecar" containerID="cri-o://b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e" gracePeriod=2 Apr 22 14:47:20.394433 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.394410 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:20.395103 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.395086 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:47:20.419258 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.419236 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:47:20.449087 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449059 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-model-cache\") pod \"edd21b84-194f-43ee-b019-8a392b6e1029\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " Apr 22 14:47:20.449288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449108 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-kserve-provision-location\") pod \"acc96f71-652c-466a-b7dc-12ddf07b951a\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " Apr 22 14:47:20.449288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449132 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-72wfr\" (UniqueName: \"kubernetes.io/projected/edd21b84-194f-43ee-b019-8a392b6e1029-kube-api-access-72wfr\") pod \"edd21b84-194f-43ee-b019-8a392b6e1029\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " Apr 22 14:47:20.449288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449153 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-home\") pod \"acc96f71-652c-466a-b7dc-12ddf07b951a\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " Apr 22 14:47:20.449288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449185 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-dshm\") pod \"edd21b84-194f-43ee-b019-8a392b6e1029\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " Apr 22 14:47:20.449288 
ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449204 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-home\") pod \"edd21b84-194f-43ee-b019-8a392b6e1029\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " Apr 22 14:47:20.449288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449225 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-dshm\") pod \"acc96f71-652c-466a-b7dc-12ddf07b951a\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " Apr 22 14:47:20.449288 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449252 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/edd21b84-194f-43ee-b019-8a392b6e1029-tls-certs\") pod \"edd21b84-194f-43ee-b019-8a392b6e1029\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " Apr 22 14:47:20.449697 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449339 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acc96f71-652c-466a-b7dc-12ddf07b951a-tls-certs\") pod \"acc96f71-652c-466a-b7dc-12ddf07b951a\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " Apr 22 14:47:20.449697 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449371 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-l99gm\" (UniqueName: \"kubernetes.io/projected/acc96f71-652c-466a-b7dc-12ddf07b951a-kube-api-access-l99gm\") pod \"acc96f71-652c-466a-b7dc-12ddf07b951a\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " Apr 22 14:47:20.449697 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449396 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume 
"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-model-cache" (OuterVolumeSpecName: "model-cache") pod "edd21b84-194f-43ee-b019-8a392b6e1029" (UID: "edd21b84-194f-43ee-b019-8a392b6e1029"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:47:20.449697 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449409 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-model-cache\") pod \"acc96f71-652c-466a-b7dc-12ddf07b951a\" (UID: \"acc96f71-652c-466a-b7dc-12ddf07b951a\") " Apr 22 14:47:20.449697 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449467 2578 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-kserve-provision-location\") pod \"edd21b84-194f-43ee-b019-8a392b6e1029\" (UID: \"edd21b84-194f-43ee-b019-8a392b6e1029\") " Apr 22 14:47:20.449697 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449612 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-model-cache" (OuterVolumeSpecName: "model-cache") pod "acc96f71-652c-466a-b7dc-12ddf07b951a" (UID: "acc96f71-652c-466a-b7dc-12ddf07b951a"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:47:20.450001 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449807 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-model-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.450001 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.449830 2578 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-model-cache\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.450106 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.450085 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-home" (OuterVolumeSpecName: "home") pod "acc96f71-652c-466a-b7dc-12ddf07b951a" (UID: "acc96f71-652c-466a-b7dc-12ddf07b951a"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:47:20.450422 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.450399 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-home" (OuterVolumeSpecName: "home") pod "edd21b84-194f-43ee-b019-8a392b6e1029" (UID: "edd21b84-194f-43ee-b019-8a392b6e1029"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:47:20.452244 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.452120 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/acc96f71-652c-466a-b7dc-12ddf07b951a-kube-api-access-l99gm" (OuterVolumeSpecName: "kube-api-access-l99gm") pod "acc96f71-652c-466a-b7dc-12ddf07b951a" (UID: "acc96f71-652c-466a-b7dc-12ddf07b951a"). InnerVolumeSpecName "kube-api-access-l99gm". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:47:20.452657 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.452616 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/acc96f71-652c-466a-b7dc-12ddf07b951a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "acc96f71-652c-466a-b7dc-12ddf07b951a" (UID: "acc96f71-652c-466a-b7dc-12ddf07b951a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:47:20.452783 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.452755 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/edd21b84-194f-43ee-b019-8a392b6e1029-kube-api-access-72wfr" (OuterVolumeSpecName: "kube-api-access-72wfr") pod "edd21b84-194f-43ee-b019-8a392b6e1029" (UID: "edd21b84-194f-43ee-b019-8a392b6e1029"). InnerVolumeSpecName "kube-api-access-72wfr". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 22 14:47:20.452783 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.452765 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-dshm" (OuterVolumeSpecName: "dshm") pod "edd21b84-194f-43ee-b019-8a392b6e1029" (UID: "edd21b84-194f-43ee-b019-8a392b6e1029"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:47:20.453245 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.453221 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-dshm" (OuterVolumeSpecName: "dshm") pod "acc96f71-652c-466a-b7dc-12ddf07b951a" (UID: "acc96f71-652c-466a-b7dc-12ddf07b951a"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:47:20.453939 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.453913 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/edd21b84-194f-43ee-b019-8a392b6e1029-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "edd21b84-194f-43ee-b019-8a392b6e1029" (UID: "edd21b84-194f-43ee-b019-8a392b6e1029"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 22 14:47:20.506778 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.506717 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "acc96f71-652c-466a-b7dc-12ddf07b951a" (UID: "acc96f71-652c-466a-b7dc-12ddf07b951a"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:47:20.512655 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.512622 2578 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "edd21b84-194f-43ee-b019-8a392b6e1029" (UID: "edd21b84-194f-43ee-b019-8a392b6e1029"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 22 14:47:20.550566 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550534 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-dshm\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550566 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550564 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-home\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550566 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550575 2578 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-dshm\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550584 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/edd21b84-194f-43ee-b019-8a392b6e1029-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550594 2578 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/acc96f71-652c-466a-b7dc-12ddf07b951a-tls-certs\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550604 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-l99gm\" (UniqueName: \"kubernetes.io/projected/acc96f71-652c-466a-b7dc-12ddf07b951a-kube-api-access-l99gm\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550614 2578 reconciler_common.go:299] "Volume detached for volume 
\"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/edd21b84-194f-43ee-b019-8a392b6e1029-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550623 2578 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-kserve-provision-location\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550632 2578 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-72wfr\" (UniqueName: \"kubernetes.io/projected/edd21b84-194f-43ee-b019-8a392b6e1029-kube-api-access-72wfr\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.550759 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.550641 2578 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/acc96f71-652c-466a-b7dc-12ddf07b951a-home\") on node \"ip-10-0-132-130.ec2.internal\" DevicePath \"\"" Apr 22 14:47:20.846684 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.846650 2578 generic.go:358] "Generic (PLEG): container finished" podID="edd21b84-194f-43ee-b019-8a392b6e1029" containerID="c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565" exitCode=137 Apr 22 14:47:20.846910 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.846726 2578 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" Apr 22 14:47:20.846910 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.846726 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" event={"ID":"edd21b84-194f-43ee-b019-8a392b6e1029","Type":"ContainerDied","Data":"c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565"} Apr 22 14:47:20.846910 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.846773 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl" event={"ID":"edd21b84-194f-43ee-b019-8a392b6e1029","Type":"ContainerDied","Data":"d935d018269c1317d139534a66b1c722bf703d4802d017dbabad7bb93f4e4f01"} Apr 22 14:47:20.846910 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.846794 2578 scope.go:117] "RemoveContainer" containerID="c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565" Apr 22 14:47:20.848078 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.848062 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-pd-test-kserve-64bf496d54-bxxzr_acc96f71-652c-466a-b7dc-12ddf07b951a/main/0.log" Apr 22 14:47:20.848758 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.848733 2578 generic.go:358] "Generic (PLEG): container finished" podID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerID="3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c" exitCode=137 Apr 22 14:47:20.848758 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.848758 2578 generic.go:358] "Generic (PLEG): container finished" podID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerID="b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e" exitCode=0 Apr 22 14:47:20.848905 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.848802 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" event={"ID":"acc96f71-652c-466a-b7dc-12ddf07b951a","Type":"ContainerDied","Data":"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c"} Apr 22 14:47:20.848905 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.848814 2578 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" Apr 22 14:47:20.848905 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.848828 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" event={"ID":"acc96f71-652c-466a-b7dc-12ddf07b951a","Type":"ContainerDied","Data":"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e"} Apr 22 14:47:20.848905 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.848844 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr" event={"ID":"acc96f71-652c-466a-b7dc-12ddf07b951a","Type":"ContainerDied","Data":"8663d59dbb20614005d5a41a8851b153a2edf3406abe9cc1ffa74aa1074e5eba"} Apr 22 14:47:20.871177 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.871141 2578 scope.go:117] "RemoveContainer" containerID="7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050" Apr 22 14:47:20.874247 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.874219 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"] Apr 22 14:47:20.877966 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.877940 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-prefill-5859b66879-9wgcl"] Apr 22 14:47:20.890730 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.890700 2578 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"] Apr 22 14:47:20.895280 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.895252 2578 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-pd-test-kserve-64bf496d54-bxxzr"] Apr 22 14:47:20.935361 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.935339 2578 scope.go:117] "RemoveContainer" containerID="c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565" Apr 22 14:47:20.935710 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:47:20.935692 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565\": container with ID starting with c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565 not found: ID does not exist" containerID="c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565" Apr 22 14:47:20.935762 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.935719 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565"} err="failed to get container status \"c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565\": rpc error: code = NotFound desc = could not find container \"c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565\": container with ID starting with c247fd57401347265812d94ac38b4ae537ae0a8fecb6cc41f859b2d11514e565 not found: ID does not exist" Apr 22 14:47:20.935762 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.935742 2578 scope.go:117] "RemoveContainer" containerID="7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050" Apr 22 14:47:20.935986 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:47:20.935970 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container 
\"7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050\": container with ID starting with 7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050 not found: ID does not exist" containerID="7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050" Apr 22 14:47:20.936031 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.935989 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050"} err="failed to get container status \"7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050\": rpc error: code = NotFound desc = could not find container \"7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050\": container with ID starting with 7b34dee3d71dd5a2396378b85d0aa0bae536bb44ae0359e92aa1799ce2abb050 not found: ID does not exist" Apr 22 14:47:20.936031 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.936003 2578 scope.go:117] "RemoveContainer" containerID="3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c" Apr 22 14:47:20.957138 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:20.957114 2578 scope.go:117] "RemoveContainer" containerID="6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e" Apr 22 14:47:21.016369 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.016342 2578 scope.go:117] "RemoveContainer" containerID="b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e" Apr 22 14:47:21.024165 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.024145 2578 scope.go:117] "RemoveContainer" containerID="3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c" Apr 22 14:47:21.024479 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:47:21.024458 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c\": container with ID starting with 
3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c not found: ID does not exist" containerID="3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c" Apr 22 14:47:21.024534 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.024492 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c"} err="failed to get container status \"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c\": rpc error: code = NotFound desc = could not find container \"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c\": container with ID starting with 3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c not found: ID does not exist" Apr 22 14:47:21.024534 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.024512 2578 scope.go:117] "RemoveContainer" containerID="6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e" Apr 22 14:47:21.024766 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:47:21.024752 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e\": container with ID starting with 6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e not found: ID does not exist" containerID="6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e" Apr 22 14:47:21.024813 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.024770 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e"} err="failed to get container status \"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e\": rpc error: code = NotFound desc = could not find container \"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e\": container with ID starting with 
6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e not found: ID does not exist" Apr 22 14:47:21.024813 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.024785 2578 scope.go:117] "RemoveContainer" containerID="b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e" Apr 22 14:47:21.024985 ip-10-0-132-130 kubenswrapper[2578]: E0422 14:47:21.024967 2578 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e\": container with ID starting with b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e not found: ID does not exist" containerID="b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e" Apr 22 14:47:21.025032 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.024992 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e"} err="failed to get container status \"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e\": rpc error: code = NotFound desc = could not find container \"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e\": container with ID starting with b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e not found: ID does not exist" Apr 22 14:47:21.025032 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.025009 2578 scope.go:117] "RemoveContainer" containerID="3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c" Apr 22 14:47:21.025231 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.025214 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c"} err="failed to get container status \"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c\": rpc error: code = NotFound desc = could not find 
container \"3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c\": container with ID starting with 3f50ae2ae40ed3ae85969f4c8a5a5c23f7dbabfa7d63f0b21006af61c0b2ee6c not found: ID does not exist" Apr 22 14:47:21.025284 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.025231 2578 scope.go:117] "RemoveContainer" containerID="6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e" Apr 22 14:47:21.025449 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.025427 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e"} err="failed to get container status \"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e\": rpc error: code = NotFound desc = could not find container \"6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e\": container with ID starting with 6d7f4de43085fd0ca21c6a47479886285395da3e6c9b14f2fbcf4ea80e00934e not found: ID does not exist" Apr 22 14:47:21.025530 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.025452 2578 scope.go:117] "RemoveContainer" containerID="b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e" Apr 22 14:47:21.025696 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.025679 2578 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e"} err="failed to get container status \"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e\": rpc error: code = NotFound desc = could not find container \"b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e\": container with ID starting with b356defe73f55233acc390a594ecbfb4077c6f85d27a130069e209cf9f70f36e not found: ID does not exist" Apr 22 14:47:21.952540 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.952506 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" 
podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" path="/var/lib/kubelet/pods/acc96f71-652c-466a-b7dc-12ddf07b951a/volumes" Apr 22 14:47:21.953020 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:21.953005 2578 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" path="/var/lib/kubelet/pods/edd21b84-194f-43ee-b019-8a392b6e1029/volumes" Apr 22 14:47:22.140011 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:22.139974 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-56d64c8c4-q4z5h_817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6/router/0.log" Apr 22 14:47:23.003916 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:23.003882 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-56d64c8c4-q4z5h_817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6/router/0.log" Apr 22 14:47:23.843866 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:23.843827 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jr972_cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9/authorino/0.log" Apr 22 14:47:23.889875 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:23.889847 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-zmhlt_a69c2dcb-23d5-4a8f-a772-1146cb65f5be/kuadrant-console-plugin/0.log" Apr 22 14:47:26.422664 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.422626 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-54rh7/must-gather-dv6pr"] Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.422968 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="storage-initializer" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.422981 2578 state_mem.go:107] "Deleted CPUSet assignment" 
podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="storage-initializer" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.422989 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="main" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.422994 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="main" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423005 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="llm-d-routing-sidecar" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423012 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="llm-d-routing-sidecar" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423021 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423027 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423033 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" containerName="main" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423039 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" containerName="main" Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423046 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" 
podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="tokenizer"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423051 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="tokenizer"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423057 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="storage-initializer"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423063 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="storage-initializer"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423074 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423079 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423086 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" containerName="storage-initializer"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423091 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" containerName="storage-initializer"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423096 2578 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="storage-initializer"
Apr 22 14:47:26.423097 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423101 2578 state_mem.go:107] "Deleted CPUSet assignment" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="storage-initializer"
Apr 22 14:47:26.423830 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423148 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="edd21b84-194f-43ee-b019-8a392b6e1029" containerName="main"
Apr 22 14:47:26.423830 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423156 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="llm-d-routing-sidecar"
Apr 22 14:47:26.423830 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423162 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="tokenizer"
Apr 22 14:47:26.423830 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423171 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9fd74841-a4b1-4c4d-ac58-01005c578a43" containerName="main"
Apr 22 14:47:26.423830 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423179 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="9a0950f9-5dd9-4e0a-96bf-1523cc3b0fdc" containerName="main"
Apr 22 14:47:26.423830 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.423186 2578 memory_manager.go:356] "RemoveStaleState removing state" podUID="acc96f71-652c-466a-b7dc-12ddf07b951a" containerName="main"
Apr 22 14:47:26.427460 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.427442 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-54rh7/must-gather-dv6pr"
Apr 22 14:47:26.430109 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.430089 2578 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-54rh7\"/\"default-dockercfg-4zg56\""
Apr 22 14:47:26.430226 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.430123 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-54rh7\"/\"openshift-service-ca.crt\""
Apr 22 14:47:26.430226 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.430169 2578 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-54rh7\"/\"kube-root-ca.crt\""
Apr 22 14:47:26.438144 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.438116 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-54rh7/must-gather-dv6pr"]
Apr 22 14:47:26.495455 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.495414 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a18cdf32-8777-405f-bb47-d59979bf695b-must-gather-output\") pod \"must-gather-dv6pr\" (UID: \"a18cdf32-8777-405f-bb47-d59979bf695b\") " pod="openshift-must-gather-54rh7/must-gather-dv6pr"
Apr 22 14:47:26.495615 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.495492 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t6lwr\" (UniqueName: \"kubernetes.io/projected/a18cdf32-8777-405f-bb47-d59979bf695b-kube-api-access-t6lwr\") pod \"must-gather-dv6pr\" (UID: \"a18cdf32-8777-405f-bb47-d59979bf695b\") " pod="openshift-must-gather-54rh7/must-gather-dv6pr"
Apr 22 14:47:26.596263 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.596224 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t6lwr\" (UniqueName: \"kubernetes.io/projected/a18cdf32-8777-405f-bb47-d59979bf695b-kube-api-access-t6lwr\") pod \"must-gather-dv6pr\" (UID: \"a18cdf32-8777-405f-bb47-d59979bf695b\") " pod="openshift-must-gather-54rh7/must-gather-dv6pr"
Apr 22 14:47:26.596468 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.596415 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a18cdf32-8777-405f-bb47-d59979bf695b-must-gather-output\") pod \"must-gather-dv6pr\" (UID: \"a18cdf32-8777-405f-bb47-d59979bf695b\") " pod="openshift-must-gather-54rh7/must-gather-dv6pr"
Apr 22 14:47:26.596745 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.596728 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/a18cdf32-8777-405f-bb47-d59979bf695b-must-gather-output\") pod \"must-gather-dv6pr\" (UID: \"a18cdf32-8777-405f-bb47-d59979bf695b\") " pod="openshift-must-gather-54rh7/must-gather-dv6pr"
Apr 22 14:47:26.605990 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.605961 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t6lwr\" (UniqueName: \"kubernetes.io/projected/a18cdf32-8777-405f-bb47-d59979bf695b-kube-api-access-t6lwr\") pod \"must-gather-dv6pr\" (UID: \"a18cdf32-8777-405f-bb47-d59979bf695b\") " pod="openshift-must-gather-54rh7/must-gather-dv6pr"
Apr 22 14:47:26.738362 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.738218 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-54rh7/must-gather-dv6pr"
Apr 22 14:47:26.872319 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.872110 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-54rh7/must-gather-dv6pr"]
Apr 22 14:47:26.874912 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:47:26.874883 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda18cdf32_8777_405f_bb47_d59979bf695b.slice/crio-6eed4999c50263110ab903a361ec5a417914438638e622b7017c394d3628303a WatchSource:0}: Error finding container 6eed4999c50263110ab903a361ec5a417914438638e622b7017c394d3628303a: Status 404 returned error can't find the container with id 6eed4999c50263110ab903a361ec5a417914438638e622b7017c394d3628303a
Apr 22 14:47:26.876592 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:26.876575 2578 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 22 14:47:27.876838 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:27.876805 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-54rh7/must-gather-dv6pr" event={"ID":"a18cdf32-8777-405f-bb47-d59979bf695b","Type":"ContainerStarted","Data":"6eed4999c50263110ab903a361ec5a417914438638e622b7017c394d3628303a"}
Apr 22 14:47:28.883495 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:28.883459 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-54rh7/must-gather-dv6pr" event={"ID":"a18cdf32-8777-405f-bb47-d59979bf695b","Type":"ContainerStarted","Data":"420eedc54a3bcc158bf9331d598d99ae154936f37343a9d4e60fdea1a9515436"}
Apr 22 14:47:28.883856 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:28.883504 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-54rh7/must-gather-dv6pr" event={"ID":"a18cdf32-8777-405f-bb47-d59979bf695b","Type":"ContainerStarted","Data":"185c70bcc1f2506c96019a9c3bcb638ffbbf5f8f3404cdbef9558cf0587dceec"}
Apr 22 14:47:28.900413 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:28.900358 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-54rh7/must-gather-dv6pr" podStartSLOduration=1.820037546 podStartE2EDuration="2.900339791s" podCreationTimestamp="2026-04-22 14:47:26 +0000 UTC" firstStartedPulling="2026-04-22 14:47:26.876698908 +0000 UTC m=+1947.551502151" lastFinishedPulling="2026-04-22 14:47:27.957001149 +0000 UTC m=+1948.631804396" observedRunningTime="2026-04-22 14:47:28.898707879 +0000 UTC m=+1949.573511144" watchObservedRunningTime="2026-04-22 14:47:28.900339791 +0000 UTC m=+1949.575143056"
Apr 22 14:47:29.608490 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:29.608451 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-hwfss_bd7a4001-4bdb-4908-9ffe-6cd7422ebd4e/global-pull-secret-syncer/0.log"
Apr 22 14:47:29.726143 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:29.726111 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-f7m86_215c8235-1207-4971-9cc3-8c7aaa57988c/konnectivity-agent/0.log"
Apr 22 14:47:29.805313 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:29.805259 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-132-130.ec2.internal_973f3092b03452d8285b08e0d93dce0b/haproxy/0.log"
Apr 22 14:47:33.929051 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:33.929015 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-jr972_cc092e41-b40d-4bb3-bdcf-d2bc2a3cc3c9/authorino/0.log"
Apr 22 14:47:34.013151 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:34.013124 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-zmhlt_a69c2dcb-23d5-4a8f-a772-1146cb65f5be/kuadrant-console-plugin/0.log"
Apr 22 14:47:35.258258 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:35.258131 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-4x527_78b16110-0b35-41a7-b840-f66b6fb4ac09/cluster-monitoring-operator/0.log"
Apr 22 14:47:35.376995 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:35.376962 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_monitoring-plugin-7dccd58f55-p4m4h_2cdf65a3-3a9c-48f6-be02-eae076517f5d/monitoring-plugin/0.log"
Apr 22 14:47:35.485623 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:35.485591 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-998dw_0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9/node-exporter/0.log"
Apr 22 14:47:35.503160 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:35.503132 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-998dw_0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9/kube-rbac-proxy/0.log"
Apr 22 14:47:35.522199 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:35.522089 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-998dw_0b47ac76-f656-4ddd-b1b8-4ce0dbaa74d9/init-textfile/0.log"
Apr 22 14:47:35.856451 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:35.856362 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dchw2_763d841d-2a98-4990-ade8-3c083f6c8372/prometheus-operator/0.log"
Apr 22 14:47:35.880919 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:35.880875 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-dchw2_763d841d-2a98-4990-ade8-3c083f6c8372/kube-rbac-proxy/0.log"
Apr 22 14:47:35.919044 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:35.919018 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-admission-webhook-57cf98b594-bj7j9_6d2ba58b-9ecb-433c-9cd2-adebb69b6d62/prometheus-operator-admission-webhook/0.log"
Apr 22 14:47:36.045311 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:36.045256 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b7dd87cb-9s2ws_ed392dfd-aa36-4d8c-a598-160a741281e8/thanos-query/0.log"
Apr 22 14:47:36.068925 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:36.068889 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b7dd87cb-9s2ws_ed392dfd-aa36-4d8c-a598-160a741281e8/kube-rbac-proxy-web/0.log"
Apr 22 14:47:36.091975 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:36.091945 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b7dd87cb-9s2ws_ed392dfd-aa36-4d8c-a598-160a741281e8/kube-rbac-proxy/0.log"
Apr 22 14:47:36.119143 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:36.119068 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b7dd87cb-9s2ws_ed392dfd-aa36-4d8c-a598-160a741281e8/prom-label-proxy/0.log"
Apr 22 14:47:36.141196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:36.141161 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b7dd87cb-9s2ws_ed392dfd-aa36-4d8c-a598-160a741281e8/kube-rbac-proxy-rules/0.log"
Apr 22 14:47:36.160566 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:36.160452 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_thanos-querier-57b7dd87cb-9s2ws_ed392dfd-aa36-4d8c-a598-160a741281e8/kube-rbac-proxy-metrics/0.log"
Apr 22 14:47:38.374541 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.374506 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-kfnwb_8fdb260a-3bb8-4141-8359-e18230a3d1ee/download-server/0.log"
Apr 22 14:47:38.742050 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.742010 2578 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"]
Apr 22 14:47:38.749591 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.749557 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.755368 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.755336 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"]
Apr 22 14:47:38.809755 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.809717 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-podres\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.809946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.809763 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-sys\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.809946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.809846 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-proc\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.809946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.809878 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-lib-modules\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.809946 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.809918 2578 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7lqjb\" (UniqueName: \"kubernetes.io/projected/88c93b40-e7db-4301-80fe-5e65e1077b11-kube-api-access-7lqjb\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.874643 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.874614 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-bm8bj_f204bbda-e891-46ec-b7ae-0fa017516505/volume-data-source-validator/0.log"
Apr 22 14:47:38.911157 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911116 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-proc\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.911927 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911412 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-proc\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.911927 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911456 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-lib-modules\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.911927 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911517 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7lqjb\" (UniqueName: \"kubernetes.io/projected/88c93b40-e7db-4301-80fe-5e65e1077b11-kube-api-access-7lqjb\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.911927 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911583 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-lib-modules\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.911927 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911610 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-podres\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.911927 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911643 2578 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-sys\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.911927 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911772 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-sys\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.911927 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.911881 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/88c93b40-e7db-4301-80fe-5e65e1077b11-podres\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:38.920417 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:38.920386 2578 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7lqjb\" (UniqueName: \"kubernetes.io/projected/88c93b40-e7db-4301-80fe-5e65e1077b11-kube-api-access-7lqjb\") pod \"perf-node-gather-daemonset-vsjbh\" (UID: \"88c93b40-e7db-4301-80fe-5e65e1077b11\") " pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:39.062893 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.062813 2578 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:39.209504 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.209465 2578 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"]
Apr 22 14:47:39.214202 ip-10-0-132-130 kubenswrapper[2578]: W0422 14:47:39.214166 2578 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-pod88c93b40_e7db_4301_80fe_5e65e1077b11.slice/crio-044de339c8a3736a3c90f10f13d38198c192e7ca3b3f1e02a1310f5748111278 WatchSource:0}: Error finding container 044de339c8a3736a3c90f10f13d38198c192e7ca3b3f1e02a1310f5748111278: Status 404 returned error can't find the container with id 044de339c8a3736a3c90f10f13d38198c192e7ca3b3f1e02a1310f5748111278
Apr 22 14:47:39.665886 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.665857 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kxldt_46042ad7-7ef3-4e0b-88e4-d9d9077a34d0/dns/0.log"
Apr 22 14:47:39.684578 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.684552 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-kxldt_46042ad7-7ef3-4e0b-88e4-d9d9077a34d0/kube-rbac-proxy/0.log"
Apr 22 14:47:39.752484 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.752454 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-wbp94_0a26fb71-5407-413c-a14c-18f3085f4abf/dns-node-resolver/0.log"
Apr 22 14:47:39.941453 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.941360 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh" event={"ID":"88c93b40-e7db-4301-80fe-5e65e1077b11","Type":"ContainerStarted","Data":"0a3bec8e5059d0c68185b3dcd963a8dca6cbd5bd67994765401a0ae456675449"}
Apr 22 14:47:39.941453 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.941408 2578 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh" event={"ID":"88c93b40-e7db-4301-80fe-5e65e1077b11","Type":"ContainerStarted","Data":"044de339c8a3736a3c90f10f13d38198c192e7ca3b3f1e02a1310f5748111278"}
Apr 22 14:47:39.942196 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.942172 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:39.962038 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:39.961978 2578 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh" podStartSLOduration=1.961960238 podStartE2EDuration="1.961960238s" podCreationTimestamp="2026-04-22 14:47:38 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-22 14:47:39.960526596 +0000 UTC m=+1960.635329862" watchObservedRunningTime="2026-04-22 14:47:39.961960238 +0000 UTC m=+1960.636763503"
Apr 22 14:47:40.354274 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:40.354152 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-mnpw8_de390571-ad59-463e-84ce-017e582c71b4/node-ca/0.log"
Apr 22 14:47:41.256648 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:41.256618 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-56d64c8c4-q4z5h_817ce9fb-bd94-4e0f-bbbc-cc50e1ef3bf6/router/0.log"
Apr 22 14:47:41.722773 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:41.722742 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-xhmzs_52ec1912-fa03-4d61-8364-c8cc1159fcb5/serve-healthcheck-canary/0.log"
Apr 22 14:47:42.224096 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:42.224070 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dtswc_f5aeece0-3eea-4cf5-8d59-cf4520bff33c/kube-rbac-proxy/0.log"
Apr 22 14:47:42.242426 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:42.242395 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dtswc_f5aeece0-3eea-4cf5-8d59-cf4520bff33c/exporter/0.log"
Apr 22 14:47:42.262781 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:42.262754 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-dtswc_f5aeece0-3eea-4cf5-8d59-cf4520bff33c/extractor/0.log"
Apr 22 14:47:44.816733 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:44.816701 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-879f8864c-2gqzx_5dfaf3df-2a3a-41a0-abf2-cfd57af634aa/manager/0.log"
Apr 22 14:47:44.862836 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:44.862806 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_openshift-lws-operator-bfc7f696d-zf482_9fefa1c0-f5be-444c-927a-f862a64444c6/openshift-lws-operator/0.log"
Apr 22 14:47:46.964498 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:46.964471 2578 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-54rh7/perf-node-gather-daemonset-vsjbh"
Apr 22 14:47:52.484346 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.484318 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z67br_0e41ada6-0c6f-4380-8fc5-ac4005a2c30b/kube-multus-additional-cni-plugins/0.log"
Apr 22 14:47:52.512029 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.511986 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z67br_0e41ada6-0c6f-4380-8fc5-ac4005a2c30b/egress-router-binary-copy/0.log"
Apr 22 14:47:52.532333 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.532290 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z67br_0e41ada6-0c6f-4380-8fc5-ac4005a2c30b/cni-plugins/0.log"
Apr 22 14:47:52.550086 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.550042 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z67br_0e41ada6-0c6f-4380-8fc5-ac4005a2c30b/bond-cni-plugin/0.log"
Apr 22 14:47:52.566181 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.566154 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z67br_0e41ada6-0c6f-4380-8fc5-ac4005a2c30b/routeoverride-cni/0.log"
Apr 22 14:47:52.610937 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.610891 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z67br_0e41ada6-0c6f-4380-8fc5-ac4005a2c30b/whereabouts-cni-bincopy/0.log"
Apr 22 14:47:52.624429 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.624388 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-z67br_0e41ada6-0c6f-4380-8fc5-ac4005a2c30b/whereabouts-cni/0.log"
Apr 22 14:47:52.685239 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.685205 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-qnk5j_20f9c88a-aaca-401a-b81d-d9a32b00d92a/kube-multus/0.log"
Apr 22 14:47:52.751906 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.751821 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8q2mm_aee73a14-6669-4d65-8987-69628270ae6d/network-metrics-daemon/0.log"
Apr 22 14:47:52.770641 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:52.770554 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-8q2mm_aee73a14-6669-4d65-8987-69628270ae6d/kube-rbac-proxy/0.log"
Apr 22 14:47:53.573589 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.573554 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-controller/0.log"
Apr 22 14:47:53.587151 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.587100 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/0.log"
Apr 22 14:47:53.605886 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.605858 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovn-acl-logging/1.log"
Apr 22 14:47:53.638249 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.638220 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/kube-rbac-proxy-node/0.log"
Apr 22 14:47:53.686758 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.686726 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/kube-rbac-proxy-ovn-metrics/0.log"
Apr 22 14:47:53.717934 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.717902 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/northd/0.log"
Apr 22 14:47:53.736005 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.735977 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/nbdb/0.log"
Apr 22 14:47:53.756820 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.756795 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/sbdb/0.log"
Apr 22 14:47:53.949050 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:53.949000 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-2kpzl_5678ae75-291c-4f06-82ee-c0d558cb29dc/ovnkube-controller/0.log"
Apr 22 14:47:55.844473 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:55.844445 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9kmw7_177ed1a3-5a44-405b-a340-c5e0c5655232/network-check-target-container/0.log"
Apr 22 14:47:56.861540 ip-10-0-132-130 kubenswrapper[2578]: I0422 14:47:56.861467 2578 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-network-diagnostics_network-check-target-9kmw7_177ed1a3-5a44-405b-a340-c5e0c5655232/network-check-target-container/0.log" path="/var/log/pods/openshift-network-operator_iptables-alerter-pcvj2_d2d20229-b414-4e83-be51-e3c0fd756697/iptables-alerter/0.log"