Apr 23 13:31:19.135656 ip-10-0-135-229 systemd[1]: Starting Kubernetes Kubelet...
Apr 23 13:31:19.571291 ip-10-0-135-229 kubenswrapper[2576]: Flag --container-runtime-endpoint has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:19.571291 ip-10-0-135-229 kubenswrapper[2576]: Flag --minimum-container-ttl-duration has been deprecated, Use --eviction-hard or --eviction-soft instead. Will be removed in a future version.
Apr 23 13:31:19.571291 ip-10-0-135-229 kubenswrapper[2576]: Flag --volume-plugin-dir has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:19.571291 ip-10-0-135-229 kubenswrapper[2576]: Flag --pod-infra-container-image has been deprecated, will be removed in 1.35. Image garbage collector will get sandbox image information from CRI.
Apr 23 13:31:19.571291 ip-10-0-135-229 kubenswrapper[2576]: Flag --system-reserved has been deprecated, This parameter should be set via the config file specified by the Kubelet's --config flag. See https://kubernetes.io/docs/tasks/administer-cluster/kubelet-config-file/ for more information.
Apr 23 13:31:19.574057 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.573974 2576 server.go:212] "--pod-infra-container-image will not be pruned by the image garbage collector in kubelet and should also be set in the remote runtime"
Apr 23 13:31:19.578314 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578300 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578316 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
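The deprecation warnings above all point at the same remedy: move these settings into the KubeletConfiguration file named by --config. A minimal sketch of the equivalent config, using field names from the kubelet.config.k8s.io/v1beta1 API; the values shown are illustrative assumptions, not read from this node (only the CRI-O socket path appears in the log itself):

```yaml
# Sketch of /etc/kubernetes/kubelet.conf replacing the deprecated flags.
apiVersion: kubelet.config.k8s.io/v1beta1
kind: KubeletConfiguration
# Replaces --container-runtime-endpoint (socket path matches this log).
containerRuntimeEndpoint: unix:///var/run/crio/crio.sock
# Replaces --volume-plugin-dir (path below is an assumption).
volumePluginDir: /etc/kubernetes/kubelet-plugins/volume/exec
# Replaces --system-reserved (reservations below are illustrative).
systemReserved:
  cpu: 500m
  memory: 1Gi
# --minimum-container-ttl-duration is superseded by eviction thresholds.
evictionHard:
  memory.available: 100Mi
```

Note that --pod-infra-container-image has no config-file equivalent here; per the log, the sandbox image should also be configured in the container runtime (for CRI-O, its pause_image setting).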
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578320 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578324 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578339 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578342 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578345 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578348 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578350 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578353 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578356 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578358 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578361 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578363 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578366 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:19.578363 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578369 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578372 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578375 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578377 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578380 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578383 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578386 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578388 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578391 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578393 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578395 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578398 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578400 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578403 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578405 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578408 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578410 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578413 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578415 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:19.578712 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578418 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578420 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578422 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578425 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578427 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578430 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578432 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578435 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578438 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578440 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578442 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578445 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578447 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578451 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578453 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578457 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578460 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578463 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578466 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578469 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:19.579146 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578473 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578475 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578478 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578481 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578483 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578486 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578488 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578491 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578493 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578496 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578498 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578501 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578503 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578505 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578508 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578510 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578521 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578524 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578527 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:19.579632 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578529 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578531 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578534 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578536 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578539 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578541 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578544 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578546 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578549 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578553 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578556 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578559 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:19.580075 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.578561 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580114 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580120 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580124 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580127 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580130 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580132 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580135 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580138 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580140 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580143 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580145 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580148 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580150 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580153 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580157 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580160 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580165 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580168 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580171 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:19.580381 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580174 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580177 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580180 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580183 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580185 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580188 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580191 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580194 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580196 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580199 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580201 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580204 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580206 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580208 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580211 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580214 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580217 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580219 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580221 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:19.580935 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580224 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580226 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580229 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580231 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580234 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580236 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580239 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580241 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580243 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580246 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580249 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580252 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580254 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580256 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580259 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580262 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580264 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580267 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580269 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580273 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:19.581426 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580275 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580278 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580280 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580282 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580285 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580287 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580289 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580292 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580294 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580297 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580299 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580301 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580304 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580308 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580310 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580313 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580315 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580318 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580320 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580322 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:19.581910 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580336 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580338 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580341 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580343 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580346 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580349 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580351 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.580355 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580422 2576 flags.go:64] FLAG: --address="0.0.0.0"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580429 2576 flags.go:64] FLAG: --allowed-unsafe-sysctls="[]"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580436 2576 flags.go:64] FLAG: --anonymous-auth="true"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580441 2576 flags.go:64] FLAG: --application-metrics-count-limit="100"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580445 2576 flags.go:64] FLAG: --authentication-token-webhook="false"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580449 2576 flags.go:64] FLAG: --authentication-token-webhook-cache-ttl="2m0s"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580453 2576 flags.go:64] FLAG: --authorization-mode="AlwaysAllow"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580458 2576 flags.go:64] FLAG: --authorization-webhook-cache-authorized-ttl="5m0s"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580461 2576 flags.go:64] FLAG: --authorization-webhook-cache-unauthorized-ttl="30s"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580464 2576 flags.go:64] FLAG: --boot-id-file="/proc/sys/kernel/random/boot_id"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580468 2576 flags.go:64] FLAG: --bootstrap-kubeconfig="/etc/kubernetes/kubeconfig"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580471 2576 flags.go:64] FLAG: --cert-dir="/var/lib/kubelet/pki"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580474 2576 flags.go:64] FLAG: --cgroup-driver="cgroupfs"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580477 2576 flags.go:64] FLAG: --cgroup-root=""
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580480 2576 flags.go:64] FLAG: --cgroups-per-qos="true"
Apr 23 13:31:19.582392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580483 2576 flags.go:64] FLAG: --client-ca-file=""
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580486 2576 flags.go:64] FLAG: --cloud-config=""
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580489 2576 flags.go:64] FLAG: --cloud-provider="external"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580491 2576 flags.go:64] FLAG: --cluster-dns="[]"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580496 2576 flags.go:64] FLAG: --cluster-domain=""
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580499 2576 flags.go:64] FLAG: --config="/etc/kubernetes/kubelet.conf"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580502 2576 flags.go:64] FLAG: --config-dir=""
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580505 2576 flags.go:64] FLAG: --container-hints="/etc/cadvisor/container_hints.json"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580508 2576 flags.go:64] FLAG: --container-log-max-files="5"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580512 2576 flags.go:64] FLAG: --container-log-max-size="10Mi"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580515 2576 flags.go:64] FLAG: --container-runtime-endpoint="/var/run/crio/crio.sock"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580519 2576 flags.go:64] FLAG: --containerd="/run/containerd/containerd.sock"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580522 2576 flags.go:64] FLAG: --containerd-namespace="k8s.io"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580525 2576 flags.go:64] FLAG: --contention-profiling="false"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580528 2576 flags.go:64] FLAG: --cpu-cfs-quota="true"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580531 2576 flags.go:64] FLAG: --cpu-cfs-quota-period="100ms"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580535 2576 flags.go:64] FLAG: --cpu-manager-policy="none"
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580538 2576 flags.go:64] FLAG: --cpu-manager-policy-options=""
Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580542 2576 flags.go:64] FLAG: --cpu-manager-reconcile-period="10s" Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580545 2576 flags.go:64] FLAG: --enable-controller-attach-detach="true" Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580548 2576 flags.go:64] FLAG: --enable-debugging-handlers="true" Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580551 2576 flags.go:64] FLAG: --enable-load-reader="false" Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580554 2576 flags.go:64] FLAG: --enable-server="true" Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580557 2576 flags.go:64] FLAG: --enforce-node-allocatable="[pods]" Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580561 2576 flags.go:64] FLAG: --event-burst="100" Apr 23 13:31:19.582935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580565 2576 flags.go:64] FLAG: --event-qps="50" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580567 2576 flags.go:64] FLAG: --event-storage-age-limit="default=0" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580570 2576 flags.go:64] FLAG: --event-storage-event-limit="default=0" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580573 2576 flags.go:64] FLAG: --eviction-hard="" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580577 2576 flags.go:64] FLAG: --eviction-max-pod-grace-period="0" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580580 2576 flags.go:64] FLAG: --eviction-minimum-reclaim="" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580583 2576 flags.go:64] FLAG: --eviction-pressure-transition-period="5m0s" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:31:19.580586 2576 flags.go:64] FLAG: --eviction-soft="" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580589 2576 flags.go:64] FLAG: --eviction-soft-grace-period="" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580591 2576 flags.go:64] FLAG: --exit-on-lock-contention="false" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580594 2576 flags.go:64] FLAG: --experimental-allocatable-ignore-eviction="false" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580597 2576 flags.go:64] FLAG: --experimental-mounter-path="" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580600 2576 flags.go:64] FLAG: --fail-cgroupv1="false" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580602 2576 flags.go:64] FLAG: --fail-swap-on="true" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580605 2576 flags.go:64] FLAG: --feature-gates="" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580609 2576 flags.go:64] FLAG: --file-check-frequency="20s" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580611 2576 flags.go:64] FLAG: --global-housekeeping-interval="1m0s" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580614 2576 flags.go:64] FLAG: --hairpin-mode="promiscuous-bridge" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580618 2576 flags.go:64] FLAG: --healthz-bind-address="127.0.0.1" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580621 2576 flags.go:64] FLAG: --healthz-port="10248" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580624 2576 flags.go:64] FLAG: --help="false" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580627 2576 flags.go:64] FLAG: --hostname-override="ip-10-0-135-229.ec2.internal" Apr 23 
13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580630 2576 flags.go:64] FLAG: --housekeeping-interval="10s" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580633 2576 flags.go:64] FLAG: --http-check-frequency="20s" Apr 23 13:31:19.583553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580636 2576 flags.go:64] FLAG: --image-credential-provider-bin-dir="/usr/libexec/kubelet-image-credential-provider-plugins" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580640 2576 flags.go:64] FLAG: --image-credential-provider-config="/etc/kubernetes/credential-providers/ecr-credential-provider.yaml" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580643 2576 flags.go:64] FLAG: --image-gc-high-threshold="85" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580646 2576 flags.go:64] FLAG: --image-gc-low-threshold="80" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580649 2576 flags.go:64] FLAG: --image-service-endpoint="" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580652 2576 flags.go:64] FLAG: --kernel-memcg-notification="false" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580655 2576 flags.go:64] FLAG: --kube-api-burst="100" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580657 2576 flags.go:64] FLAG: --kube-api-content-type="application/vnd.kubernetes.protobuf" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580660 2576 flags.go:64] FLAG: --kube-api-qps="50" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580663 2576 flags.go:64] FLAG: --kube-reserved="" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580666 2576 flags.go:64] FLAG: --kube-reserved-cgroup="" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580669 2576 
flags.go:64] FLAG: --kubeconfig="/var/lib/kubelet/kubeconfig" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580672 2576 flags.go:64] FLAG: --kubelet-cgroups="" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580674 2576 flags.go:64] FLAG: --local-storage-capacity-isolation="true" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580677 2576 flags.go:64] FLAG: --lock-file="" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580680 2576 flags.go:64] FLAG: --log-cadvisor-usage="false" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580682 2576 flags.go:64] FLAG: --log-flush-frequency="5s" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580685 2576 flags.go:64] FLAG: --log-json-info-buffer-size="0" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580694 2576 flags.go:64] FLAG: --log-json-split-stream="false" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580697 2576 flags.go:64] FLAG: --log-text-info-buffer-size="0" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580700 2576 flags.go:64] FLAG: --log-text-split-stream="false" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580703 2576 flags.go:64] FLAG: --logging-format="text" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580706 2576 flags.go:64] FLAG: --machine-id-file="/etc/machine-id,/var/lib/dbus/machine-id" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580709 2576 flags.go:64] FLAG: --make-iptables-util-chains="true" Apr 23 13:31:19.584101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580712 2576 flags.go:64] FLAG: --manifest-url="" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580714 2576 flags.go:64] FLAG: --manifest-url-header="" Apr 23 13:31:19.584668 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580719 2576 flags.go:64] FLAG: --max-housekeeping-interval="15s" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580722 2576 flags.go:64] FLAG: --max-open-files="1000000" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580726 2576 flags.go:64] FLAG: --max-pods="110" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580728 2576 flags.go:64] FLAG: --maximum-dead-containers="-1" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580731 2576 flags.go:64] FLAG: --maximum-dead-containers-per-container="1" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580737 2576 flags.go:64] FLAG: --memory-manager-policy="None" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580740 2576 flags.go:64] FLAG: --minimum-container-ttl-duration="6m0s" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580743 2576 flags.go:64] FLAG: --minimum-image-ttl-duration="2m0s" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580746 2576 flags.go:64] FLAG: --node-ip="0.0.0.0" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580750 2576 flags.go:64] FLAG: --node-labels="node-role.kubernetes.io/worker=,node.openshift.io/os_id=rhel" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580757 2576 flags.go:64] FLAG: --node-status-max-images="50" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580760 2576 flags.go:64] FLAG: --node-status-update-frequency="10s" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580763 2576 flags.go:64] FLAG: --oom-score-adj="-999" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580766 2576 flags.go:64] FLAG: --pod-cidr="" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:31:19.580769 2576 flags.go:64] FLAG: --pod-infra-container-image="quay.io/openshift-release-dev/ocp-v4.0-art-dev@sha256:c8cfe89231412ff3ee8cb6207fa0be33cad0f08e88c9c0f1e9f7e8c6f14d6715" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580774 2576 flags.go:64] FLAG: --pod-manifest-path="" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580777 2576 flags.go:64] FLAG: --pod-max-pids="-1" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580780 2576 flags.go:64] FLAG: --pods-per-core="0" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580782 2576 flags.go:64] FLAG: --port="10250" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580785 2576 flags.go:64] FLAG: --protect-kernel-defaults="false" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580788 2576 flags.go:64] FLAG: --provider-id="aws:///us-east-1a/i-002a6bd3906fd3167" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580791 2576 flags.go:64] FLAG: --qos-reserved="" Apr 23 13:31:19.584668 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580794 2576 flags.go:64] FLAG: --read-only-port="10255" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580797 2576 flags.go:64] FLAG: --register-node="true" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580800 2576 flags.go:64] FLAG: --register-schedulable="true" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580803 2576 flags.go:64] FLAG: --register-with-taints="" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580806 2576 flags.go:64] FLAG: --registry-burst="10" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580816 2576 flags.go:64] FLAG: --registry-qps="5" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580819 2576 flags.go:64] 
FLAG: --reserved-cpus="" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580822 2576 flags.go:64] FLAG: --reserved-memory="" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580825 2576 flags.go:64] FLAG: --resolv-conf="/etc/resolv.conf" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580828 2576 flags.go:64] FLAG: --root-dir="/var/lib/kubelet" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580831 2576 flags.go:64] FLAG: --rotate-certificates="false" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580836 2576 flags.go:64] FLAG: --rotate-server-certificates="false" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580839 2576 flags.go:64] FLAG: --runonce="false" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580842 2576 flags.go:64] FLAG: --runtime-cgroups="/system.slice/crio.service" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580845 2576 flags.go:64] FLAG: --runtime-request-timeout="2m0s" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580848 2576 flags.go:64] FLAG: --seccomp-default="false" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580855 2576 flags.go:64] FLAG: --serialize-image-pulls="true" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580858 2576 flags.go:64] FLAG: --storage-driver-buffer-duration="1m0s" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580862 2576 flags.go:64] FLAG: --storage-driver-db="cadvisor" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580865 2576 flags.go:64] FLAG: --storage-driver-host="localhost:8086" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580868 2576 flags.go:64] FLAG: --storage-driver-password="root" Apr 23 13:31:19.585248 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:31:19.580871 2576 flags.go:64] FLAG: --storage-driver-secure="false" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580874 2576 flags.go:64] FLAG: --storage-driver-table="stats" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580876 2576 flags.go:64] FLAG: --storage-driver-user="root" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580879 2576 flags.go:64] FLAG: --streaming-connection-idle-timeout="4h0m0s" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580882 2576 flags.go:64] FLAG: --sync-frequency="1m0s" Apr 23 13:31:19.585248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580885 2576 flags.go:64] FLAG: --system-cgroups="" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580887 2576 flags.go:64] FLAG: --system-reserved="cpu=500m,ephemeral-storage=1Gi,memory=1Gi" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580893 2576 flags.go:64] FLAG: --system-reserved-cgroup="" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580896 2576 flags.go:64] FLAG: --tls-cert-file="" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580899 2576 flags.go:64] FLAG: --tls-cipher-suites="[]" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580902 2576 flags.go:64] FLAG: --tls-min-version="" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580905 2576 flags.go:64] FLAG: --tls-private-key-file="" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580908 2576 flags.go:64] FLAG: --topology-manager-policy="none" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580911 2576 flags.go:64] FLAG: --topology-manager-policy-options="" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580913 2576 flags.go:64] FLAG: 
--topology-manager-scope="container" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580916 2576 flags.go:64] FLAG: --v="2" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580920 2576 flags.go:64] FLAG: --version="false" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580924 2576 flags.go:64] FLAG: --vmodule="" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580928 2576 flags.go:64] FLAG: --volume-plugin-dir="/etc/kubernetes/kubelet-plugins/volume/exec" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.580931 2576 flags.go:64] FLAG: --volume-stats-agg-period="1m0s" Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581020 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581023 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581028 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581031 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581034 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581037 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581039 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581043 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation 
Apr 23 13:31:19.585881 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581046 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581049 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581051 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581054 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581056 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581059 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581062 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581064 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581066 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581069 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581071 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581074 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud Apr 23 13:31:19.586736 ip-10-0-135-229 
kubenswrapper[2576]: W0423 13:31:19.581076 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581079 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581081 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581084 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581086 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581088 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581091 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud Apr 23 13:31:19.586736 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581093 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581096 2576 feature_gate.go:328] unrecognized feature gate: NewOLM Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581099 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581101 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581103 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581106 2576 feature_gate.go:328] unrecognized feature gate: Example Apr 23 
13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581108 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581112 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581114 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581117 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581119 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581122 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581125 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581128 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581131 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581134 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581136 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581139 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581141 2576 feature_gate.go:328] unrecognized feature gate: 
IngressControllerLBSubnetsAWS Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581144 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations Apr 23 13:31:19.587564 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581146 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581149 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581151 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581154 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581156 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581158 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581161 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581163 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581166 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581169 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581171 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: 
W0423 13:31:19.581175 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release. Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581179 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581181 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581184 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581186 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581188 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581191 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581193 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581198 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:19.588304 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581200 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581202 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581206 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release. 
Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581210 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581215 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581218 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581221 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581227 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581230 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581233 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581236 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581238 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581241 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581244 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581247 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581249 2576 feature_gate.go:328] 
unrecognized feature gate: Example2 Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581252 2576 feature_gate.go:328] unrecognized feature gate: DualReplica Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581254 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup Apr 23 13:31:19.588815 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.581257 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.581265 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]} Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.588861 2576 server.go:530] "Kubelet version" kubeletVersion="v1.33.9" Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.589015 2576 server.go:532] "Golang settings" GOGC="" GOMAXPROCS="" GOTRACEBACK="" Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589100 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589108 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589113 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589118 2576 feature_gate.go:328] unrecognized feature gate: 
AdminNetworkPolicy
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589122 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589127 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589131 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589135 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589139 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589146 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589152 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589157 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:19.589550 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589162 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589166 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589171 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589175 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589180 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589188 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589193 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589197 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589202 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589206 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589210 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589214 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589218 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589222 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589226 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589230 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589234 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589238 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589242 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589247 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:19.590225 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589251 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589257 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589263 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589268 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589272 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589276 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589280 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589284 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589288 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589292 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589296 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589300 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589304 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589308 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589312 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589316 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589320 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589324 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589345 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:19.590761 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589350 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589355 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589359 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589363 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589367 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589371 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589375 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589379 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589383 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589387 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589390 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589395 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589399 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589404 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589409 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589413 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589417 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589421 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589424 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589429 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:19.591382 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589433 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589437 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589441 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589445 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589449 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589453 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589457 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589461 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589465 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589468 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589473 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589477 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589483 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589487 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589491 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:19.592180 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.589499 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589650 2576 feature_gate.go:328] unrecognized feature gate: NewOLM
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589658 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDCWithUIDAndExtraClaimMappings
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589662 2576 feature_gate.go:328] unrecognized feature gate: NetworkDiagnosticsConfig
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589667 2576 feature_gate.go:328] unrecognized feature gate: BootcNodeManagement
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589672 2576 feature_gate.go:328] unrecognized feature gate: NutanixMultiSubnets
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589676 2576 feature_gate.go:328] unrecognized feature gate: ExternalSnapshotMetadata
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589680 2576 feature_gate.go:328] unrecognized feature gate: ShortCertRotation
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589684 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiNetworks
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589689 2576 feature_gate.go:328] unrecognized feature gate: VSphereHostVMGroupZonal
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589694 2576 feature_gate.go:328] unrecognized feature gate: DyanmicServiceEndpointIBMCloud
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589698 2576 feature_gate.go:328] unrecognized feature gate: NetworkLiveMigration
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589702 2576 feature_gate.go:328] unrecognized feature gate: ImageStreamImportMode
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589706 2576 feature_gate.go:328] unrecognized feature gate: UpgradeStatus
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589710 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPI
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589715 2576 feature_gate.go:328] unrecognized feature gate: OpenShiftPodSecurityAdmission
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589719 2576 feature_gate.go:328] unrecognized feature gate: MetricsCollectionProfiles
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589723 2576 feature_gate.go:328] unrecognized feature gate: NewOLMWebhookProviderOpenshiftServiceCA
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589727 2576 feature_gate.go:328] unrecognized feature gate: MultiArchInstallAzure
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589731 2576 feature_gate.go:328] unrecognized feature gate: MultiDiskSetup
Apr 23 13:31:19.592782 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589735 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAzure
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589739 2576 feature_gate.go:328] unrecognized feature gate: AdminNetworkPolicy
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589743 2576 feature_gate.go:328] unrecognized feature gate: VSphereMixedNodeEnv
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589747 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstallIBMCloud
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589751 2576 feature_gate.go:328] unrecognized feature gate: AutomatedEtcdBackup
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589755 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIMigration
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589759 2576 feature_gate.go:328] unrecognized feature gate: AzureDedicatedHosts
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589764 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfig
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589768 2576 feature_gate.go:328] unrecognized feature gate: ClusterMonitoringConfig
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589772 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerLBSubnetsAWS
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589777 2576 feature_gate.go:328] unrecognized feature gate: SignatureStores
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589781 2576 feature_gate.go:328] unrecognized feature gate: ImageModeStatusReporting
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589785 2576 feature_gate.go:328] unrecognized feature gate: HighlyAvailableArbiter
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589789 2576 feature_gate.go:328] unrecognized feature gate: NewOLMCatalogdAPIV1Metas
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589793 2576 feature_gate.go:328] unrecognized feature gate: PinnedImages
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589797 2576 feature_gate.go:328] unrecognized feature gate: InsightsOnDemandDataGather
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589801 2576 feature_gate.go:328] unrecognized feature gate: InsightsConfigAPI
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589805 2576 feature_gate.go:328] unrecognized feature gate: DualReplica
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589809 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNSInstall
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589814 2576 feature_gate.go:328] unrecognized feature gate: AzureClusterHostedDNSInstall
Apr 23 13:31:19.593351 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589818 2576 feature_gate.go:328] unrecognized feature gate: ExternalOIDC
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589821 2576 feature_gate.go:328] unrecognized feature gate: AWSDedicatedHosts
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589826 2576 feature_gate.go:328] unrecognized feature gate: IngressControllerDynamicConfigurationManager
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589830 2576 feature_gate.go:328] unrecognized feature gate: NetworkSegmentation
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589834 2576 feature_gate.go:328] unrecognized feature gate: VolumeGroupSnapshot
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589838 2576 feature_gate.go:328] unrecognized feature gate: Example2
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589842 2576 feature_gate.go:328] unrecognized feature gate: EtcdBackendQuota
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589846 2576 feature_gate.go:328] unrecognized feature gate: SetEIPForNLBIngressController
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589851 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesAWS
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589855 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerification
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589859 2576 feature_gate.go:328] unrecognized feature gate: NewOLMPreflightPermissionChecks
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589862 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpointsInstall
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589874 2576 feature_gate.go:328] unrecognized feature gate: Example
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589880 2576 feature_gate.go:328] unrecognized feature gate: NoRegistryClusterOperations
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589884 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNSInstall
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589888 2576 feature_gate.go:328] unrecognized feature gate: AlibabaPlatform
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589892 2576 feature_gate.go:328] unrecognized feature gate: MachineConfigNodes
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589896 2576 feature_gate.go:328] unrecognized feature gate: VSphereMultiDisk
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589900 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImagesvSphere
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589904 2576 feature_gate.go:328] unrecognized feature gate: AdditionalRoutingCapabilities
Apr 23 13:31:19.593908 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589910 2576 feature_gate.go:349] Setting deprecated feature gate KMSv1=true. It will be removed in a future release.
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589917 2576 feature_gate.go:328] unrecognized feature gate: AzureMultiDisk
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589921 2576 feature_gate.go:328] unrecognized feature gate: IrreconcilableMachineConfig
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589926 2576 feature_gate.go:328] unrecognized feature gate: NewOLMOwnSingleNamespace
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589931 2576 feature_gate.go:328] unrecognized feature gate: CPMSMachineNamePrefix
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589938 2576 feature_gate.go:351] Setting GA feature gate ServiceAccountTokenNodeBinding=true. It will be removed in a future release.
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589943 2576 feature_gate.go:328] unrecognized feature gate: KMSEncryptionProvider
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589948 2576 feature_gate.go:328] unrecognized feature gate: MachineAPIOperatorDisableMachineHealthCheckController
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589953 2576 feature_gate.go:328] unrecognized feature gate: GCPCustomAPIEndpoints
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589958 2576 feature_gate.go:328] unrecognized feature gate: BootImageSkewEnforcement
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589962 2576 feature_gate.go:328] unrecognized feature gate: PreconfiguredUDNAddresses
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589966 2576 feature_gate.go:328] unrecognized feature gate: MixedCPUsAllocation
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589970 2576 feature_gate.go:328] unrecognized feature gate: AWSClusterHostedDNS
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589974 2576 feature_gate.go:328] unrecognized feature gate: VSphereConfigurableMaxAllowedBlockVolumesPerNode
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589979 2576 feature_gate.go:328] unrecognized feature gate: ClusterVersionOperatorConfiguration
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589983 2576 feature_gate.go:328] unrecognized feature gate: DNSNameResolver
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589987 2576 feature_gate.go:328] unrecognized feature gate: AzureWorkloadIdentity
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589991 2576 feature_gate.go:328] unrecognized feature gate: GCPClusterHostedDNS
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589995 2576 feature_gate.go:328] unrecognized feature gate: GatewayAPIController
Apr 23 13:31:19.594445 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.589999 2576 feature_gate.go:328] unrecognized feature gate: OVNObservability
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.590002 2576 feature_gate.go:328] unrecognized feature gate: ClusterAPIInstall
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.590006 2576 feature_gate.go:328] unrecognized feature gate: SigstoreImageVerificationPKI
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.590011 2576 feature_gate.go:328] unrecognized feature gate: BuildCSIVolumes
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.590014 2576 feature_gate.go:328] unrecognized feature gate: ConsolePluginContentSecurityPolicy
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.590018 2576 feature_gate.go:328] unrecognized feature gate: AWSServiceLBNetworkSecurityGroup
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.590022 2576 feature_gate.go:328] unrecognized feature gate: ManagedBootImages
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:19.590026 2576 feature_gate.go:328] unrecognized feature gate: RouteAdvertisements
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.590034 2576 feature_gate.go:384] feature gates: {map[DynamicResourceAllocation:false EventedPLEG:false ImageVolume:true KMSv1:true MaxUnavailableStatefulSet:false MinimumKubeletVersion:false MutatingAdmissionPolicy:false NodeSwap:false ProcMountType:true RouteExternalCertificate:true SELinuxMount:false ServiceAccountTokenNodeBinding:true StoragePerformantSecurityPolicy:true TranslateStreamCloseWebsocketRequests:false UserNamespacesPodSecurityStandards:true UserNamespacesSupport:true VolumeAttributesClass:false]}
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.590712 2576 server.go:962] "Client rotation is on, will bootstrap in background"
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.593227 2576 bootstrap.go:101] "Use the bootstrap credentials to request a cert, and set kubeconfig to point to the certificate dir"
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.594242 2576 server.go:1019] "Starting client certificate rotation"
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.594350 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:31:19.594893 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.594386 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kube-apiserver-client-kubelet"
Apr 23 13:31:19.619204 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.619181 2576 dynamic_cafile_content.go:123] "Loaded a new CA Bundle and Verifier" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:31:19.622134 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.622105 2576 dynamic_cafile_content.go:161] "Starting controller" name="client-ca-bundle::/etc/kubernetes/kubelet-ca.crt"
Apr 23 13:31:19.635943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.635928 2576 log.go:25] "Validated CRI v1 runtime API"
Apr 23 13:31:19.641472 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.641457 2576 log.go:25] "Validated CRI v1 image API"
Apr 23 13:31:19.644787 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.644772 2576 server.go:1452] "Using cgroup driver setting received from the CRI runtime" cgroupDriver="systemd"
Apr 23 13:31:19.648900 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.648880 2576 fs.go:135] Filesystem UUIDs: map[416a60ac-3ad2-4a69-95de-64c445e42cc3:/dev/nvme0n1p3 75f4d7f8-845e-4765-8134-d3328142358a:/dev/nvme0n1p4 7B77-95E7:/dev/nvme0n1p2]
Apr 23 13:31:19.648957 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.648898 2576 fs.go:136] Filesystem partitions: map[/dev/nvme0n1p3:{mountpoint:/boot major:259 minor:3 fsType:ext4 blockSize:0} /dev/nvme0n1p4:{mountpoint:/var major:259 minor:4 fsType:xfs blockSize:0} /dev/shm:{mountpoint:/dev/shm major:0 minor:22 fsType:tmpfs blockSize:0} /run:{mountpoint:/run major:0 minor:24 fsType:tmpfs blockSize:0} /tmp:{mountpoint:/tmp major:0 minor:31 fsType:tmpfs blockSize:0} composefs_0-33:{mountpoint:/ major:0 minor:33 fsType:overlay blockSize:0}]
Apr 23 13:31:19.652302 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.652284 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kube-apiserver-client-kubelet" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:31:19.654998 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.654893 2576 manager.go:217] Machine: {Timestamp:2026-04-23 13:31:19.652913414 +0000 UTC m=+0.400915360 CPUVendorID:GenuineIntel NumCores:8 NumPhysicalCores:4 NumSockets:1 CpuFrequency:3200209 MemoryCapacity:33164492800 SwapCapacity:0 MemoryByType:map[] NVMInfo:{MemoryModeCapacity:0 AppDirectModeCapacity:0 AvgPowerBudget:0} HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] MachineID:ec22639ffd372984bb01b8ea1afe3044 SystemUUID:ec22639f-fd37-2984-bb01-b8ea1afe3044 BootID:bf9f9c77-7c91-42f9-ad4c-039628ed3e80 Filesystems:[{Device:/dev/shm DeviceMajor:0 DeviceMinor:22 Capacity:16582246400 Type:vfs Inodes:4048400 HasInodes:true} {Device:/run DeviceMajor:0 DeviceMinor:24 Capacity:6632898560 Type:vfs Inodes:819200 HasInodes:true} {Device:/dev/nvme0n1p4 DeviceMajor:259 DeviceMinor:4 Capacity:128243970048 Type:vfs Inodes:62651840 HasInodes:true} {Device:composefs_0-33 DeviceMajor:0 DeviceMinor:33 Capacity:6103040 Type:vfs Inodes:18446744073709551615 HasInodes:true} {Device:/tmp DeviceMajor:0 DeviceMinor:31 Capacity:16582246400 Type:vfs Inodes:1048576 HasInodes:true} {Device:/dev/nvme0n1p3 DeviceMajor:259 DeviceMinor:3 Capacity:366869504 Type:vfs Inodes:98304 HasInodes:true}] DiskMap:map[259:0:{Name:nvme0n1 Major:259 Minor:0 Size:128849018880 Scheduler:none}] NetworkDevices:[{Name:br-ex MacAddress:02:86:43:ea:68:db Speed:0 Mtu:9001} {Name:ens5 MacAddress:02:86:43:ea:68:db Speed:0 Mtu:9001} {Name:ovs-system MacAddress:5a:64:57:82:30:e8 Speed:0 Mtu:1500}] Topology:[{Id:0 Memory:33164492800 HugePages:[{PageSize:1048576 NumPages:0} {PageSize:2048 NumPages:0}] Cores:[{Id:0 Threads:[0 4] Caches:[{Id:0 Size:32768 Type:Data Level:1} {Id:0 Size:32768 Type:Instruction Level:1} {Id:0 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:1 Threads:[1 5] Caches:[{Id:1 Size:32768 Type:Data Level:1} {Id:1 Size:32768 Type:Instruction Level:1} {Id:1 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:2 Threads:[2 6] Caches:[{Id:2 Size:32768 Type:Data Level:1} {Id:2 Size:32768 Type:Instruction Level:1} {Id:2 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:} {Id:3 Threads:[3 7] Caches:[{Id:3 Size:32768 Type:Data Level:1} {Id:3 Size:32768 Type:Instruction Level:1} {Id:3 Size:1048576 Type:Unified Level:2}] UncoreCaches:[] SocketID:0 BookID: DrawerID:}] Caches:[{Id:0 Size:37486592 Type:Unified Level:3}] Distances:[10]}] CloudProvider:Unknown InstanceType:Unknown InstanceID:None}
Apr 23 13:31:19.654998 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.654994 2576 manager_no_libpfm.go:29] cAdvisor is build without cgo and/or libpfm support. Perf event counters are not available.
Apr 23 13:31:19.655097 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.655068 2576 manager.go:233] Version: {KernelVersion:5.14.0-570.107.1.el9_6.x86_64 ContainerOsVersion:Red Hat Enterprise Linux CoreOS 9.6.20260414-0 (Plow) DockerVersion: DockerAPIVersion: CadvisorVersion: CadvisorRevision:}
Apr 23 13:31:19.656033 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.656009 2576 container_manager_linux.go:270] "Container manager verified user specified cgroup-root exists" cgroupRoot=[]
Apr 23 13:31:19.656161 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.656036 2576 container_manager_linux.go:275] "Creating Container Manager object based on Node Config" nodeConfig={"NodeName":"ip-10-0-135-229.ec2.internal","RuntimeCgroupsName":"/system.slice/crio.service","SystemCgroupsName":"/system.slice","KubeletCgroupsName":"","KubeletOOMScoreAdj":-999,"ContainerRuntime":"","CgroupsPerQOS":true,"CgroupRoot":"/","CgroupDriver":"systemd","KubeletRootDir":"/var/lib/kubelet","ProtectKernelDefaults":true,"KubeReservedCgroupName":"","SystemReservedCgroupName":"","ReservedSystemCPUs":{},"EnforceNodeAllocatable":{"pods":{}},"KubeReserved":null,"SystemReserved":{"cpu":"500m","ephemeral-storage":"1Gi","memory":"1Gi"},"HardEvictionThresholds":[{"Signal":"nodefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.15},"GracePeriod":0,"MinReclaim":null},{"Signal":"imagefs.inodesFree","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.05},"GracePeriod":0,"MinReclaim":null},{"Signal":"memory.available","Operator":"LessThan","Value":{"Quantity":"100Mi","Percentage":0},"GracePeriod":0,"MinReclaim":null},{"Signal":"nodefs.available","Operator":"LessThan","Value":{"Quantity":null,"Percentage":0.1},"GracePeriod":0,"MinReclaim":null}],"QOSReserved":{},"CPUManagerPolicy":"none","CPUManagerPolicyOptions":null,"TopologyManagerScope":"container","CPUManagerReconcilePeriod":10000000000,"MemoryManagerPolicy":"None","MemoryManagerReservedMemory":null,"PodPidsLimit":4096,"EnforceCPULimits":true,"CPUCFSQuotaPeriod":100000000,"TopologyManagerPolicy":"none","TopologyManagerPolicyOptions":null,"CgroupVersion":2}
Apr 23 13:31:19.656202 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.656171 2576 topology_manager.go:138] "Creating topology manager with none policy"
Apr 23 13:31:19.656202 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.656181 2576 container_manager_linux.go:306] "Creating device plugin manager"
Apr 23 13:31:19.656202 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.656194 2576 manager.go:141] "Creating Device Plugin manager" path="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:31:19.656275 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.656210 2576 server.go:72] "Creating device plugin registration server" version="v1beta1" socket="/var/lib/kubelet/device-plugins/kubelet.sock"
Apr 23 13:31:19.657394 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.657383 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:31:19.657502 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.657494 2576 server.go:1267] "Using root directory" path="/var/lib/kubelet"
Apr 23 13:31:19.660158 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.660148 2576 kubelet.go:491] "Attempting to sync node with API server"
Apr 23 13:31:19.660190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.660162 2576 kubelet.go:386] "Adding static pod path" path="/etc/kubernetes/manifests"
Apr 23 13:31:19.660190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.660175 2576 file.go:69] "Watching path" path="/etc/kubernetes/manifests"
Apr 23 13:31:19.660190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.660184 2576 kubelet.go:397] "Adding apiserver pod source"
Apr 23 13:31:19.660294 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.660195 2576 apiserver.go:42] "Waiting for node sync before watching apiserver pods"
Apr 23 13:31:19.661263 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.661251 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:31:19.661308 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.661272 2576 state_mem.go:40] "Initialized new in-memory state store for pod resource information tracking"
Apr 23 13:31:19.663937 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.663919 2576 kuberuntime_manager.go:279] "Container runtime initialized" containerRuntime="cri-o" version="1.33.10-2.rhaos4.20.gita4d0894.el9" apiVersion="v1"
Apr 23 13:31:19.665670 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:31:19.665656 2576 kubelet.go:953] "Not starting ClusterTrustBundle informer because we are in static kubelet mode or the ClusterTrustBundleProjection featuregate is disabled" Apr 23 13:31:19.667072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667057 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/portworx-volume" Apr 23 13:31:19.667072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667073 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/empty-dir" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667079 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/git-repo" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667085 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/host-path" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667090 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/nfs" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667095 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/secret" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667101 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/iscsi" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667134 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/downward-api" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667144 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/fc" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667152 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/configmap" Apr 23 13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667168 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/projected" Apr 23 
13:31:19.667199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.667177 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/local-volume" Apr 23 13:31:19.669019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.668994 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/csi" Apr 23 13:31:19.669209 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.669195 2576 plugins.go:616] "Loaded volume plugin" pluginName="kubernetes.io/image" Apr 23 13:31:19.673243 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.673224 2576 watchdog_linux.go:99] "Systemd watchdog is not enabled" Apr 23 13:31:19.673442 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.673432 2576 server.go:1295] "Started kubelet" Apr 23 13:31:19.673610 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.673576 2576 server.go:180] "Starting to listen" address="0.0.0.0" port=10250 Apr 23 13:31:19.673696 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.673572 2576 ratelimit.go:55] "Setting rate limiting for endpoint" service="podresources" qps=100 burstTokens=10 Apr 23 13:31:19.673734 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.673696 2576 server_v1.go:47] "podresources" method="list" useActivePods=true Apr 23 13:31:19.674402 ip-10-0-135-229 systemd[1]: Started Kubernetes Kubelet. 
Apr 23 13:31:19.674558 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.674547 2576 server.go:317] "Adding debug handlers to kubelet server"
Apr 23 13:31:19.675268 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.675106 2576 csi_plugin.go:988] Failed to contact API server when waiting for CSINode publishing: csinodes.storage.k8s.io "ip-10-0-135-229.ec2.internal" is forbidden: User "system:anonymous" cannot get resource "csinodes" in API group "storage.k8s.io" at the cluster scope
Apr 23 13:31:19.675268 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.675162 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Node: nodes \"ip-10-0-135-229.ec2.internal\" is forbidden: User \"system:anonymous\" cannot list resource \"nodes\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Node"
Apr 23 13:31:19.675356 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.675263 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.Service: services is forbidden: User \"system:anonymous\" cannot list resource \"services\" in API group \"\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.Service"
Apr 23 13:31:19.677689 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.677674 2576 server.go:255] "Starting to serve the podresources API" endpoint="unix:/var/lib/kubelet/pod-resources/kubelet.sock"
Apr 23 13:31:19.677992 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.677975 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wsvv5"
Apr 23 13:31:19.681713 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.681695 2576 kubelet.go:1618] "Image garbage collection failed once. Stats initialization may not have completed yet" err="failed to get imageFs info: unable to find data in memory cache"
Apr 23 13:31:19.682176 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.682157 2576 certificate_manager.go:422] "Certificate rotation is enabled" logger="kubernetes.io/kubelet-serving"
Apr 23 13:31:19.682791 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.682711 2576 fs_resource_analyzer.go:67] "Starting FS ResourceAnalyzer"
Apr 23 13:31:19.683594 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.683571 2576 volume_manager.go:295] "The desired_state_of_world populator starts"
Apr 23 13:31:19.683594 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.683590 2576 volume_manager.go:297] "Starting Kubelet Volume Manager"
Apr 23 13:31:19.683754 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.682350 2576 event.go:359] "Server rejected event (will not retry!)" err="events is forbidden: User \"system:anonymous\" cannot create resource \"events\" in API group \"\" in the namespace \"default\"" event="&Event{ObjectMeta:{ip-10-0-135-229.ec2.internal.18a8ff979f2df994 default 0 0001-01-01 00:00:00 +0000 UTC map[] map[] [] [] []},InvolvedObject:ObjectReference{Kind:Node,Namespace:,Name:ip-10-0-135-229.ec2.internal,UID:ip-10-0-135-229.ec2.internal,APIVersion:,ResourceVersion:,FieldPath:,},Reason:Starting,Message:Starting kubelet.,Source:EventSource{Component:kubelet,Host:ip-10-0-135-229.ec2.internal,},FirstTimestamp:2026-04-23 13:31:19.673395604 +0000 UTC m=+0.421397553,LastTimestamp:2026-04-23 13:31:19.673395604 +0000 UTC m=+0.421397553,Count:1,Type:Normal,EventTime:0001-01-01 00:00:00 +0000 UTC,Series:nil,Action:,Related:nil,ReportingController:kubelet,ReportingInstance:ip-10-0-135-229.ec2.internal,}"
Apr 23 13:31:19.683754 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.683639 2576 desired_state_of_world_populator.go:150] "Desired state populator starts to run"
Apr 23 13:31:19.683754 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.683669 2576 factory.go:55] Registering systemd factory
Apr 23 13:31:19.683754 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.683713 2576 reconstruct.go:97] "Volume reconstruction finished"
Apr 23 13:31:19.683754 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.683722 2576 reconciler.go:26] "Reconciler: start to sync state"
Apr 23 13:31:19.683754 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.683739 2576 factory.go:223] Registration of the systemd container factory successfully
Apr 23 13:31:19.683968 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.683936 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:19.684469 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.684450 2576 factory.go:153] Registering CRI-O factory
Apr 23 13:31:19.684469 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.684470 2576 factory.go:223] Registration of the crio container factory successfully
Apr 23 13:31:19.684618 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.684525 2576 factory.go:221] Registration of the containerd container factory failed: unable to create containerd client: containerd: cannot unix dial containerd api service: dial unix /run/containerd/containerd.sock: connect: no such file or directory
Apr 23 13:31:19.684618 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.684549 2576 factory.go:103] Registering Raw factory
Apr 23 13:31:19.684618 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.684553 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kube-apiserver-client-kubelet" csr="csr-wsvv5"
Apr 23 13:31:19.684618 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.684564 2576 manager.go:1196] Started watching for new ooms in manager
Apr 23 13:31:19.684936 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.684912 2576 manager.go:319] Starting recovery of all containers
Apr 23 13:31:19.687729 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.687700 2576 reflector.go:200] "Failed to watch" err="failed to list *v1.CSIDriver: csidrivers.storage.k8s.io is forbidden: User \"system:anonymous\" cannot list resource \"csidrivers\" in API group \"storage.k8s.io\" at the cluster scope" logger="UnhandledError" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.CSIDriver"
Apr 23 13:31:19.687816 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.687722 2576 controller.go:145] "Failed to ensure lease exists, will retry" err="leases.coordination.k8s.io \"ip-10-0-135-229.ec2.internal\" is forbidden: User \"system:anonymous\" cannot get resource \"leases\" in API group \"coordination.k8s.io\" in the namespace \"kube-node-lease\"" interval="200ms"
Apr 23 13:31:19.694866 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.694849 2576 manager.go:324] Recovery completed
Apr 23 13:31:19.699262 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.699249 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:19.701414 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.701397 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:19.701473 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.701431 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:19.701473 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.701441 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:19.701855 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.701841 2576 cpu_manager.go:222] "Starting CPU manager" policy="none"
Apr 23 13:31:19.701855 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.701854 2576 cpu_manager.go:223] "Reconciling" reconcilePeriod="10s"
Apr 23 13:31:19.701947 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.701878 2576 state_mem.go:36] "Initialized new in-memory state store"
Apr 23 13:31:19.704015 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.704004 2576 policy_none.go:49] "None policy: Start"
Apr 23 13:31:19.704056 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.704019 2576 memory_manager.go:186] "Starting memorymanager" policy="None"
Apr 23 13:31:19.704056 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.704028 2576 state_mem.go:35] "Initializing new in-memory state store"
Apr 23 13:31:19.740849 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.740835 2576 manager.go:341] "Starting Device Plugin manager"
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.740894 2576 manager.go:517] "Failed to read data from checkpoint" err="checkpoint is not found" checkpoint="kubelet_internal_checkpoint"
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.740907 2576 server.go:85] "Starting device plugin registration server"
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.741134 2576 eviction_manager.go:189] "Eviction manager: starting control loop"
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.741146 2576 container_log_manager.go:189] "Initializing container log rotate workers" workers=1 monitorPeriod="10s"
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.741241 2576 plugin_watcher.go:51] "Plugin Watcher Start" path="/var/lib/kubelet/plugins_registry"
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.741321 2576 plugin_manager.go:116] "The desired_state_of_world populator (plugin watcher) starts"
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.741349 2576 plugin_manager.go:118] "Starting Kubelet Plugin Manager"
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.741880 2576 eviction_manager.go:267] "eviction manager: failed to check if we have separate container filesystem. Ignoring." err="non-existent label \"crio-containers\""
Apr 23 13:31:19.785500 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.741920 2576 eviction_manager.go:292] "Eviction manager: failed to get summary stats" err="failed to get node info: node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:19.818920 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.818887 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv4"
Apr 23 13:31:19.820063 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.820050 2576 kubelet_network_linux.go:49] "Initialized iptables rules." protocol="IPv6"
Apr 23 13:31:19.820153 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.820077 2576 status_manager.go:230] "Starting to sync pod status with apiserver"
Apr 23 13:31:19.820153 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.820103 2576 watchdog_linux.go:127] "Systemd watchdog is not enabled or the interval is invalid, so health checking will not be started."
Apr 23 13:31:19.820153 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.820112 2576 kubelet.go:2451] "Starting kubelet main sync loop"
Apr 23 13:31:19.820280 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.820151 2576 kubelet.go:2475] "Skipping pod synchronization" err="PLEG is not healthy: pleg has yet to be successful"
Apr 23 13:31:19.822627 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.822585 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:19.841650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.841633 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:19.842620 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.842604 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:19.842670 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.842635 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:19.842670 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.842648 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:19.842670 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.842669 2576 kubelet_node_status.go:78] "Attempting to register node" node="ip-10-0-135-229.ec2.internal"
Apr 23 13:31:19.851472 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.851458 2576 kubelet_node_status.go:81] "Successfully registered node" node="ip-10-0-135-229.ec2.internal"
Apr 23 13:31:19.851518 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.851479 2576 kubelet_node_status.go:597] "Error updating node status, will retry" err="error getting node \"ip-10-0-135-229.ec2.internal\": node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:19.871529 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.871509 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:19.920473 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.920444 2576 kubelet.go:2537] "SyncLoop ADD" source="file" pods=["kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"]
Apr 23 13:31:19.920532 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.920521 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:19.922635 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.922620 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:19.922678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.922650 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:19.922678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.922659 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:19.924002 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.923990 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:19.924143 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.924129 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:19.924183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.924163 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:19.925139 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.925115 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:19.925227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.925144 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:19.925227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.925156 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:19.925227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.925119 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:19.925227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.925226 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:19.925396 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.925238 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:19.926534 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.926521 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:19.926589 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.926545 2576 kubelet_node_status.go:413] "Setting node annotation to enable volume controller attach/detach"
Apr 23 13:31:19.927196 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.927180 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientMemory"
Apr 23 13:31:19.927281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.927203 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasNoDiskPressure"
Apr 23 13:31:19.927281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:19.927212 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeHasSufficientPID"
Apr 23 13:31:19.941828 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.941805 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-229.ec2.internal\" not found" node="ip-10-0-135-229.ec2.internal"
Apr 23 13:31:19.945758 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.945742 2576 kubelet.go:3336] "No need to create a mirror pod, since failed to get node info from the cluster" err="node \"ip-10-0-135-229.ec2.internal\" not found" node="ip-10-0-135-229.ec2.internal"
Apr 23 13:31:19.972518 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:19.972500 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.073097 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.073042 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.085372 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.085353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5447c4ab751e776bc417eb19d7fa6547-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal\" (UID: \"5447c4ab751e776bc417eb19d7fa6547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.085430 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.085380 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5ea2aeb3a492354a2bfeec0a963ac187-config\") pod \"kube-apiserver-proxy-ip-10-0-135-229.ec2.internal\" (UID: \"5ea2aeb3a492354a2bfeec0a963ac187\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.085430 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.085397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5447c4ab751e776bc417eb19d7fa6547-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal\" (UID: \"5447c4ab751e776bc417eb19d7fa6547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.173730 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.173707 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.186045 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.186029 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5ea2aeb3a492354a2bfeec0a963ac187-config\") pod \"kube-apiserver-proxy-ip-10-0-135-229.ec2.internal\" (UID: \"5ea2aeb3a492354a2bfeec0a963ac187\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.186087 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.186051 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5447c4ab751e776bc417eb19d7fa6547-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal\" (UID: \"5447c4ab751e776bc417eb19d7fa6547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.186087 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.186070 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5447c4ab751e776bc417eb19d7fa6547-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal\" (UID: \"5447c4ab751e776bc417eb19d7fa6547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.186150 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.186104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/5447c4ab751e776bc417eb19d7fa6547-var-lib-kubelet\") pod \"kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal\" (UID: \"5447c4ab751e776bc417eb19d7fa6547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.186150 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.186123 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/host-path/5ea2aeb3a492354a2bfeec0a963ac187-config\") pod \"kube-apiserver-proxy-ip-10-0-135-229.ec2.internal\" (UID: \"5ea2aeb3a492354a2bfeec0a963ac187\") " pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.186150 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.186139 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kube\" (UniqueName: \"kubernetes.io/host-path/5447c4ab751e776bc417eb19d7fa6547-etc-kube\") pod \"kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal\" (UID: \"5447c4ab751e776bc417eb19d7fa6547\") " pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.244203 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.244175 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.247833 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.247817 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:20.274316 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.274289 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.375212 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.375123 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.475647 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.475617 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.576116 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.576086 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.594534 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.594510 2576 transport.go:147] "Certificate rotation detected, shutting down client connections to start using new credentials"
Apr 23 13:31:20.594652 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.594638 2576 reflector.go:556] "Warning: watch ended with error" reflector="k8s.io/client-go/informers/factory.go:160" type="*v1.RuntimeClass" err="very short watch: k8s.io/client-go/informers/factory.go:160: Unexpected watch close - watch lasted less than a second and no items received"
Apr 23 13:31:20.676274 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.676247 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.682246 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.682225 2576 certificate_manager.go:566] "Rotating certificates" logger="kubernetes.io/kubelet-serving"
Apr 23 13:31:20.686453 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.686426 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kube-apiserver-client-kubelet" expiration="2028-04-22 13:26:19 +0000 UTC" deadline="2027-09-30 12:43:24.540794797 +0000 UTC"
Apr 23 13:31:20.686525 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.686455 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kube-apiserver-client-kubelet" sleep="12599h12m3.854343743s"
Apr 23 13:31:20.695433 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.695415 2576 reflector.go:430] "Caches populated" logger="kubernetes.io/kubelet-serving" type="*v1.CertificateSigningRequest" reflector="k8s.io/client-go/tools/watch/informerwatcher.go:162"
Apr 23 13:31:20.712772 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.712752 2576 csr.go:274] "Certificate signing request is approved, waiting to be issued" logger="kubernetes.io/kubelet-serving" csr="csr-2ckt4"
Apr 23 13:31:20.722930 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.722911 2576 csr.go:270] "Certificate signing request is issued" logger="kubernetes.io/kubelet-serving" csr="csr-2ckt4"
Apr 23 13:31:20.750680 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:20.750655 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5447c4ab751e776bc417eb19d7fa6547.slice/crio-10ddac46ef0b4cb153b57137e043f1e54e7a97af143b73bc03dabe87b274c2c0 WatchSource:0}: Error finding container 10ddac46ef0b4cb153b57137e043f1e54e7a97af143b73bc03dabe87b274c2c0: Status 404 returned error can't find the container with id 10ddac46ef0b4cb153b57137e043f1e54e7a97af143b73bc03dabe87b274c2c0
Apr 23 13:31:20.751171 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:20.751148 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod5ea2aeb3a492354a2bfeec0a963ac187.slice/crio-f38b31d3c0d25b413e6cfd02512284a68301f86dafa4606b3cc848dec1d9cd89 WatchSource:0}: Error finding container f38b31d3c0d25b413e6cfd02512284a68301f86dafa4606b3cc848dec1d9cd89: Status 404 returned error can't find the container with id f38b31d3c0d25b413e6cfd02512284a68301f86dafa4606b3cc848dec1d9cd89
Apr 23 13:31:20.754872 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.754859 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:31:20.776737 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.776718 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.822805 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.822763 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal" event={"ID":"5447c4ab751e776bc417eb19d7fa6547","Type":"ContainerStarted","Data":"10ddac46ef0b4cb153b57137e043f1e54e7a97af143b73bc03dabe87b274c2c0"}
Apr 23 13:31:20.823644 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.823624 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal" event={"ID":"5ea2aeb3a492354a2bfeec0a963ac187","Type":"ContainerStarted","Data":"f38b31d3c0d25b413e6cfd02512284a68301f86dafa4606b3cc848dec1d9cd89"}
Apr 23 13:31:20.876793 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.876769 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:20.913491 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:20.913439 2576 reflector.go:430] "Caches populated" type="*v1.Service" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:20.977549 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:20.977523 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:21.078051 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.078029 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:21.178888 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.178815 2576 kubelet_node_status.go:515] "Error getting the current node from lister" err="node \"ip-10-0-135-229.ec2.internal\" not found"
Apr 23 13:31:21.197705 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.197682 2576 reflector.go:430] "Caches populated" type="*v1.Node" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:21.266248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.266222 2576 reflector.go:430] "Caches populated" type="*v1.CSIDriver" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:21.284452 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.284428 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal"
Apr 23 13:31:21.294215 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.294169 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in
surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:21.295602 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.295585 2576 kubelet.go:3340] "Creating a mirror pod for static pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal" Apr 23 13:31:21.304959 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.304942 2576 warnings.go:110] "Warning: metadata.name: this is used in the Pod's hostname, which can result in surprising behavior; a DNS label is recommended: [must not contain dots]" Apr 23 13:31:21.661534 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.661458 2576 apiserver.go:52] "Watching apiserver" Apr 23 13:31:21.668755 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.668730 2576 reflector.go:430] "Caches populated" type="*v1.Pod" reflector="pkg/kubelet/config/apiserver.go:66" Apr 23 13:31:21.669109 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.669087 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-multus/multus-bltjj","openshift-multus/network-metrics-daemon-ms5b6","openshift-network-diagnostics/network-check-target-pzpnn","openshift-network-operator/iptables-alerter-pxr5c","kube-system/konnectivity-agent-wr4q4","kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal","openshift-dns/node-resolver-89qsg","openshift-image-registry/node-ca-7g8xg","openshift-multus/multus-additional-cni-plugins-b6xwg","openshift-ovn-kubernetes/ovnkube-node-6mgfk","openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5","openshift-cluster-node-tuning-operator/tuned-9wqrw","openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal"] Apr 23 13:31:21.671375 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.671349 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.672585 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.672560 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:21.672677 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.672646 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:21.673735 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.673714 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"openshift-service-ca.crt\"" Apr 23 13:31:21.673865 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.673847 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"kube-root-ca.crt\"" Apr 23 13:31:21.673986 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.673968 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"node-resolver-dockercfg-btbkw\"" Apr 23 13:31:21.675080 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.675061 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:21.675179 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.675127 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:21.676421 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.676403 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pxr5c" Apr 23 13:31:21.677808 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.677745 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:21.678723 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.678701 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:21.678822 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.678810 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:21.678938 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.678921 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-script\"" Apr 23 13:31:21.679644 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.679627 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-operator\"/\"iptables-alerter-dockercfg-zg2qg\"" Apr 23 13:31:21.680034 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.680018 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kube-system\"/\"konnectivity-ca-bundle\"" Apr 23 13:31:21.680111 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.680042 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.680111 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.680095 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"default-dockercfg-k28fs\"" Apr 23 13:31:21.680264 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.680114 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7g8xg" Apr 23 13:31:21.680264 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.680019 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"konnectivity-agent\"" Apr 23 13:31:21.681448 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.681430 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.682861 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.682845 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.682953 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.682931 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"cni-copy-resources\"" Apr 23 13:31:21.683120 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.683104 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"multus-daemon-config\"" Apr 23 13:31:21.683187 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.683159 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"openshift-service-ca.crt\"" Apr 23 13:31:21.683566 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.683550 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"kube-root-ca.crt\"" Apr 23 13:31:21.683704 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.683552 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"default-dockercfg-s6685\"" Apr 23 13:31:21.684130 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.684110 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"image-registry-certificates\"" Apr 23 13:31:21.684243 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.684225 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"node-ca-dockercfg-6vjsf\"" Apr 23 13:31:21.684592 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.684373 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-multus\"/\"whereabouts-flatfile-config\"" Apr 23 13:31:21.684592 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.684460 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" 
reflector="object-\"openshift-multus\"/\"default-cni-sysctl-allowlist\"" Apr 23 13:31:21.684861 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.684645 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"openshift-service-ca.crt\"" Apr 23 13:31:21.684943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.684876 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"multus-ancillary-tools-dockercfg-j987r\"" Apr 23 13:31:21.686136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.685439 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"env-overrides\"" Apr 23 13:31:21.686136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.685587 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-kubernetes-node-dockercfg-cs55k\"" Apr 23 13:31:21.686136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.685761 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-config\"" Apr 23 13:31:21.686136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.685802 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ovn-kubernetes\"/\"ovn-node-metrics-cert\"" Apr 23 13:31:21.686136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.686036 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"openshift-service-ca.crt\"" Apr 23 13:31:21.686572 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.686434 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.686572 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.686482 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.687482 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.687468 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"ovnkube-script-lib\"" Apr 23 13:31:21.687740 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.687725 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ovn-kubernetes\"/\"kube-root-ca.crt\"" Apr 23 13:31:21.688002 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.687981 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"kube-root-ca.crt\"" Apr 23 13:31:21.688799 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.688777 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"kube-root-ca.crt\"" Apr 23 13:31:21.688906 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.688888 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"tuned-dockercfg-tx9b2\"" Apr 23 13:31:21.689286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.689010 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"openshift-service-ca.crt\"" Apr 23 13:31:21.689286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.689016 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-node-tuning-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:31:21.689813 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.689770 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-sa-dockercfg-mrq52\"" Apr 23 13:31:21.689914 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.689844 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-csi-drivers\"/\"aws-ebs-csi-driver-node-metrics-serving-cert\"" Apr 23 13:31:21.690390 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.690366 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-csi-drivers\"/\"kube-root-ca.crt\"" Apr 23 13:31:21.693577 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693548 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-system-cni-dir\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.693662 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693581 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-registration-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.693662 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-system-cni-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.693662 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-cnibin\") pod \"multus-bltjj\" (UID: 
\"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.693662 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693644 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-cni-multus\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693680 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e3dce136-00f1-4f99-834a-c7f4d7ae44af-iptables-alerter-script\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693701 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693731 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693747 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693761 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e915a91-9dcb-4454-9ac4-0012727f6bdd-hosts-file\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693775 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-8rd6w\" (UniqueName: \"kubernetes.io/projected/8e915a91-9dcb-4454-9ac4-0012727f6bdd-kube-api-access-8rd6w\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693788 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-slash\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693809 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-ovn\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.693845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693831 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-ovnkube-config\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693853 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df3caba-2d71-4077-8a19-92dfab41c079-ovn-node-metrics-cert\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693912 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-socket-dir-parent\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693960 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-conf-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.693988 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fccbedc6-6cb0-47bb-8b72-95f91484d090-serviceca\") pod \"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " pod="openshift-image-registry/node-ca-7g8xg" Apr 
23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694025 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-device-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694048 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-systemd-units\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-systemd\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4t5dn\" (UniqueName: \"kubernetes.io/projected/10025a80-efb0-4838-a4b5-8e9ea110d4e1-kube-api-access-4t5dn\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694119 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-cni-sysctl-allowlist\") pod 
\"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3dce136-00f1-4f99-834a-c7f4d7ae44af-host-slash\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694176 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-ovnkube-script-lib\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694212 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-cni-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694236 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-os-release\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694274 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-kubelet\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.694307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694306 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-tw5kl\" (UniqueName: \"kubernetes.io/projected/3cb49d5f-4e57-4178-9753-b0d23608237e-kube-api-access-tw5kl\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694360 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-socket-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-c8h4q\" (UniqueName: \"kubernetes.io/projected/fccbedc6-6cb0-47bb-8b72-95f91484d090-kube-api-access-c8h4q\") pod \"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " 
pod="openshift-image-registry/node-ca-7g8xg" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694436 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694460 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e915a91-9dcb-4454-9ac4-0012727f6bdd-tmp-dir\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-cnibin\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694503 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-cni-bin\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-dir\" (UniqueName: 
\"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-env-overrides\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694593 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-os-release\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694627 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10025a80-efb0-4838-a4b5-8e9ea110d4e1-cni-binary-copy\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fccbedc6-6cb0-47bb-8b72-95f91484d090-host\") pod \"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " pod="openshift-image-registry/node-ca-7g8xg" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694675 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/049aee3e-d268-4368-b645-787f7d1e1152-konnectivity-ca\") pod \"konnectivity-agent-wr4q4\" (UID: \"049aee3e-d268-4368-b645-787f7d1e1152\") " pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-run-netns\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694722 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-node-log\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.694945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-cni-binary-copy\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694762 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-etc-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694783 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rk8wc\" (UniqueName: \"kubernetes.io/projected/e3dce136-00f1-4f99-834a-c7f4d7ae44af-kube-api-access-rk8wc\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694828 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-cni-bin\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694851 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vzzrt\" (UniqueName: \"kubernetes.io/projected/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-kube-api-access-vzzrt\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-k8s-cni-cncf-io\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694894 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-hostroot\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.694916 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-multus-certs\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695049 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-etc-selinux\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695071 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-kubelet\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695140 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-log-socket\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695170 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-daemon-config\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-etc-kubernetes\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/049aee3e-d268-4368-b645-787f7d1e1152-agent-certs\") pod \"konnectivity-agent-wr4q4\" (UID: \"049aee3e-d268-4368-b645-787f7d1e1152\") " 
pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:21.695587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695307 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-cni-netd\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.696164 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695344 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v7fwx\" (UniqueName: \"kubernetes.io/projected/8df3caba-2d71-4077-8a19-92dfab41c079-kube-api-access-v7fwx\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.696164 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695365 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-netns\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.696164 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cntl7\" (UniqueName: \"kubernetes.io/projected/58b92c1d-fc85-4d19-82d4-79f878c270ce-kube-api-access-cntl7\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.696164 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695414 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys-fs\" (UniqueName: 
\"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-sys-fs\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.696164 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.695434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-var-lib-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.723784 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.723658 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:20 +0000 UTC" deadline="2027-12-20 16:54:15.351553398 +0000 UTC" Apr 23 13:31:21.723784 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.723685 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14547h22m53.627871412s" Apr 23 13:31:21.784475 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.784457 2576 desired_state_of_world_populator.go:158] "Finished populating initial desired state of world" Apr 23 13:31:21.796241 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e915a91-9dcb-4454-9ac4-0012727f6bdd-hosts-file\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.796319 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796248 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-8rd6w\" (UniqueName: 
\"kubernetes.io/projected/8e915a91-9dcb-4454-9ac4-0012727f6bdd-kube-api-access-8rd6w\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.796319 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796271 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-slash\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.796447 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796319 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-slash\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.796447 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796358 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hosts-file\" (UniqueName: \"kubernetes.io/host-path/8e915a91-9dcb-4454-9ac4-0012727f6bdd-hosts-file\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.796447 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796387 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-ovn\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.796447 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796425 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-config\" (UniqueName: 
\"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-ovnkube-config\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.796650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796451 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df3caba-2d71-4077-8a19-92dfab41c079-ovn-node-metrics-cert\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.796650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-ovn\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-ovn\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.796650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796482 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wdbg6\" (UniqueName: \"kubernetes.io/projected/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-kube-api-access-wdbg6\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.796650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796547 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-socket-dir-parent\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.796650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796571 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-conf-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.796650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fccbedc6-6cb0-47bb-8b72-95f91484d090-serviceca\") pod \"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " pod="openshift-image-registry/node-ca-7g8xg" Apr 23 13:31:21.796650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796619 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-device-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.796650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796637 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-conf-dir\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-conf-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796641 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-socket-dir-parent\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-socket-dir-parent\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796644 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"systemd-units\" (UniqueName: 
\"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-systemd-units\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796687 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-systemd\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"systemd-units\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-systemd-units\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796716 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4t5dn\" (UniqueName: \"kubernetes.io/projected/10025a80-efb0-4838-a4b5-8e9ea110d4e1-kube-api-access-4t5dn\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3dce136-00f1-4f99-834a-c7f4d7ae44af-host-slash\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796772 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"device-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-device-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796791 2576 swap_util.go:74] "error creating dir to test if tmpfs noswap is enabled. Assuming not supported" mount path="" error="stat /var/lib/kubelet/plugins/kubernetes.io/empty-dir: no such file or directory" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796961 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-config\" (UniqueName: \"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-ovnkube-config\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797018 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797004 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-systemd\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-systemd\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797037 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serviceca\" (UniqueName: \"kubernetes.io/configmap/fccbedc6-6cb0-47bb-8b72-95f91484d090-serviceca\") pod 
\"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " pod="openshift-image-registry/node-ca-7g8xg" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-slash\" (UniqueName: \"kubernetes.io/host-path/e3dce136-00f1-4f99-834a-c7f4d7ae44af-host-slash\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.796793 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-ovnkube-script-lib\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysconfig\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797165 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-cni-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797194 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-os-release\") pod 
\"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797224 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-kubelet\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-tw5kl\" (UniqueName: \"kubernetes.io/projected/3cb49d5f-4e57-4178-9753-b0d23608237e-kube-api-access-tw5kl\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797248 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovnkube-script-lib\" (UniqueName: \"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-ovnkube-script-lib\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797255 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797260 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-cni-dir\" (UniqueName: 
\"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-cni-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797300 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-modprobe-d\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797340 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-kubernetes\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797344 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-os-release\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797351 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-networks-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-var-lib-cni-networks-ovn-kubernetes\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797300 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"host-var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-kubelet\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.797449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797381 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-socket-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-tmp\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797435 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-c8h4q\" (UniqueName: \"kubernetes.io/projected/fccbedc6-6cb0-47bb-8b72-95f91484d090-kube-api-access-c8h4q\") pod \"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " pod="openshift-image-registry/node-ca-7g8xg" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:31:21.797507 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"socket-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-socket-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797540 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e915a91-9dcb-4454-9ac4-0012727f6bdd-tmp-dir\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797584 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-host\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797621 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-cnibin\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797677 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-cnibin\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.798054 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:31:21.797729 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-cni-bin\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-env-overrides\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797840 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-kubelet-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797879 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-var-lib-kubelet\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " 
pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797883 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/8e915a91-9dcb-4454-9ac4-0012727f6bdd-tmp-dir\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797908 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-os-release\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797938 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-bin\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-cni-bin\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.798054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.797976 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10025a80-efb0-4838-a4b5-8e9ea110d4e1-cni-binary-copy\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-sysctl-allowlist\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-cni-sysctl-allowlist\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " 
pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798025 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fccbedc6-6cb0-47bb-8b72-95f91484d090-host\") pod \"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " pod="openshift-image-registry/node-ca-7g8xg" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798052 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"os-release\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-os-release\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798116 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/049aee3e-d268-4368-b645-787f7d1e1152-konnectivity-ca\") pod \"konnectivity-agent-wr4q4\" (UID: \"049aee3e-d268-4368-b645-787f7d1e1152\") " pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/fccbedc6-6cb0-47bb-8b72-95f91484d090-host\") pod \"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " pod="openshift-image-registry/node-ca-7g8xg" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798178 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-run-netns\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798204 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"env-overrides\" (UniqueName: \"kubernetes.io/configmap/8df3caba-2d71-4077-8a19-92dfab41c079-env-overrides\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798251 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-node-log\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-run-netns\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798355 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-cni-binary-copy\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798388 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-log\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-node-log\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " 
pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798449 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/10025a80-efb0-4838-a4b5-8e9ea110d4e1-cni-binary-copy\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798454 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-etc-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-etc-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"whereabouts-flatfile-configmap\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-whereabouts-flatfile-configmap\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798523 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-systemd\") pod \"tuned-9wqrw\" (UID: 
\"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.798629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798569 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-tuned\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798616 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rk8wc\" (UniqueName: \"kubernetes.io/projected/e3dce136-00f1-4f99-834a-c7f4d7ae44af-kube-api-access-rk8wc\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798658 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-cni-bin\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vzzrt\" (UniqueName: 
\"kubernetes.io/projected/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-kube-api-access-vzzrt\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798707 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysctl-conf\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798657 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"konnectivity-ca\" (UniqueName: \"kubernetes.io/configmap/049aee3e-d268-4368-b645-787f7d1e1152-konnectivity-ca\") pod \"konnectivity-agent-wr4q4\" (UID: \"049aee3e-d268-4368-b645-787f7d1e1152\") " pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798732 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-k8s-cni-cncf-io\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tuning-conf-dir\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-tuning-conf-dir\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798756 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-hostroot\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798783 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cni-binary-copy\" (UniqueName: \"kubernetes.io/configmap/58b92c1d-fc85-4d19-82d4-79f878c270ce-cni-binary-copy\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"hostroot\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-hostroot\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798833 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-k8s-cni-cncf-io\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-k8s-cni-cncf-io\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-bin\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-cni-bin\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798881 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-multus-certs\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798920 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-etc-selinux\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798939 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-multus-certs\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-multus-certs\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.799397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798941 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-kubelet\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798964 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-log-socket\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798970 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"host-kubelet\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-kubelet\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.798980 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799005 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"log-socket\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-log-socket\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysctl-d\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-selinux\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-etc-selinux\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799044 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-daemon-config\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-etc-kubernetes\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799090 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/049aee3e-d268-4368-b645-787f7d1e1152-agent-certs\") pod \"konnectivity-agent-wr4q4\" (UID: \"049aee3e-d268-4368-b645-787f7d1e1152\") " pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-cni-netd\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799138 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v7fwx\" (UniqueName: \"kubernetes.io/projected/8df3caba-2d71-4077-8a19-92dfab41c079-kube-api-access-v7fwx\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799162 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-run\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799186 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-netns\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799208 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-cni-netd\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-cni-netd\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799427 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-netns\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-run-netns\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"multus-daemon-config\" (UniqueName: \"kubernetes.io/configmap/10025a80-efb0-4838-a4b5-8e9ea110d4e1-multus-daemon-config\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799528 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-etc-kubernetes\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.800238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799211 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cntl7\" (UniqueName: \"kubernetes.io/projected/58b92c1d-fc85-4d19-82d4-79f878c270ce-kube-api-access-cntl7\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799577 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-sys-fs\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-var-lib-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799641 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-system-cni-dir\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:31:21.799648 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys-fs\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-sys-fs\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799669 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-registration-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799697 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-lib-modules\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-var-lib-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-system-cni-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 
13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799744 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-cnibin\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799765 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-cni-multus\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-var-lib-cni-multus\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-host-var-lib-cni-multus\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-system-cni-dir\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj" Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799835 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registration-dir\" (UniqueName: \"kubernetes.io/host-path/3cb49d5f-4e57-4178-9753-b0d23608237e-registration-dir\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" Apr 23 13:31:21.801128 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:31:21.799852 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cnibin\" (UniqueName: \"kubernetes.io/host-path/10025a80-efb0-4838-a4b5-8e9ea110d4e1-cnibin\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj"
Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799856 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e3dce136-00f1-4f99-834a-c7f4d7ae44af-iptables-alerter-script\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c"
Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799879 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk"
Apr 23 13:31:21.801128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799878 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"system-cni-dir\" (UniqueName: \"kubernetes.io/host-path/58b92c1d-fc85-4d19-82d4-79f878c270ce-system-cni-dir\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg"
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799894 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk"
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799909 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6"
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run-openvswitch\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-run-openvswitch\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk"
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799926 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-sys\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.799952 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host-run-ovn-kubernetes\" (UniqueName: \"kubernetes.io/host-path/8df3caba-2d71-4077-8a19-92dfab41c079-host-run-ovn-kubernetes\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk"
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.800051 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.800072 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ovn-node-metrics-cert\" (UniqueName: \"kubernetes.io/secret/8df3caba-2d71-4077-8a19-92dfab41c079-ovn-node-metrics-cert\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk"
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.800127 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. No retries permitted until 2026-04-23 13:31:22.300089361 +0000 UTC m=+3.048091308 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.800284 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"iptables-alerter-script\" (UniqueName: \"kubernetes.io/configmap/e3dce136-00f1-4f99-834a-c7f4d7ae44af-iptables-alerter-script\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c"
Apr 23 13:31:21.801773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.801763 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"agent-certs\" (UniqueName: \"kubernetes.io/secret/049aee3e-d268-4368-b645-787f7d1e1152-agent-certs\") pod \"konnectivity-agent-wr4q4\" (UID: \"049aee3e-d268-4368-b645-787f7d1e1152\") " pod="kube-system/konnectivity-agent-wr4q4"
Apr 23 13:31:21.804574 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.804555 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:21.804732 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.804585 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:21.804732 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.804607 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4qksw for pod openshift-network-diagnostics/network-check-target-pzpnn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:21.804732 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:21.804670 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw podName:2a7db36c-22a7-4fed-ba55-f60113c7ad0c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:22.30464681 +0000 UTC m=+3.052648743 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "kube-api-access-4qksw" (UniqueName: "kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw") pod "network-check-target-pzpnn" (UID: "2a7db36c-22a7-4fed-ba55-f60113c7ad0c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:21.807603 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.807237 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4t5dn\" (UniqueName: \"kubernetes.io/projected/10025a80-efb0-4838-a4b5-8e9ea110d4e1-kube-api-access-4t5dn\") pod \"multus-bltjj\" (UID: \"10025a80-efb0-4838-a4b5-8e9ea110d4e1\") " pod="openshift-multus/multus-bltjj"
Apr 23 13:31:21.807603 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.807515 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-8rd6w\" (UniqueName: \"kubernetes.io/projected/8e915a91-9dcb-4454-9ac4-0012727f6bdd-kube-api-access-8rd6w\") pod \"node-resolver-89qsg\" (UID: \"8e915a91-9dcb-4454-9ac4-0012727f6bdd\") " pod="openshift-dns/node-resolver-89qsg"
Apr 23 13:31:21.808525 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.808477 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rk8wc\" (UniqueName: \"kubernetes.io/projected/e3dce136-00f1-4f99-834a-c7f4d7ae44af-kube-api-access-rk8wc\") pod \"iptables-alerter-pxr5c\" (UID: \"e3dce136-00f1-4f99-834a-c7f4d7ae44af\") " pod="openshift-network-operator/iptables-alerter-pxr5c"
Apr 23 13:31:21.808678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.808647 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-c8h4q\" (UniqueName: \"kubernetes.io/projected/fccbedc6-6cb0-47bb-8b72-95f91484d090-kube-api-access-c8h4q\") pod \"node-ca-7g8xg\" (UID: \"fccbedc6-6cb0-47bb-8b72-95f91484d090\") " pod="openshift-image-registry/node-ca-7g8xg"
Apr 23 13:31:21.808678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.808670 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-tw5kl\" (UniqueName: \"kubernetes.io/projected/3cb49d5f-4e57-4178-9753-b0d23608237e-kube-api-access-tw5kl\") pod \"aws-ebs-csi-driver-node-d9cn5\" (UID: \"3cb49d5f-4e57-4178-9753-b0d23608237e\") " pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5"
Apr 23 13:31:21.808989 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.808958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v7fwx\" (UniqueName: \"kubernetes.io/projected/8df3caba-2d71-4077-8a19-92dfab41c079-kube-api-access-v7fwx\") pod \"ovnkube-node-6mgfk\" (UID: \"8df3caba-2d71-4077-8a19-92dfab41c079\") " pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk"
Apr 23 13:31:21.809461 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.809437 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vzzrt\" (UniqueName: \"kubernetes.io/projected/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-kube-api-access-vzzrt\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6"
Apr 23 13:31:21.810547 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.810526 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cntl7\" (UniqueName: \"kubernetes.io/projected/58b92c1d-fc85-4d19-82d4-79f878c270ce-kube-api-access-cntl7\") pod \"multus-additional-cni-plugins-b6xwg\" (UID: \"58b92c1d-fc85-4d19-82d4-79f878c270ce\") " pod="openshift-multus/multus-additional-cni-plugins-b6xwg"
Apr 23 13:31:21.849653 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.849637 2576 reflector.go:430] "Caches populated" type="*v1.RuntimeClass" reflector="k8s.io/client-go/informers/factory.go:160"
Apr 23 13:31:21.900867 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.900848 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-lib-modules\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.900943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.900884 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-sys\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.900943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.900902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wdbg6\" (UniqueName: \"kubernetes.io/projected/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-kube-api-access-wdbg6\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.900943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.900921 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysconfig\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.900943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.900937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-modprobe-d\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.900952 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-kubernetes\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.900966 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-sys\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.900975 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-tmp\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901001 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysconfig\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysconfig\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901010 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-host\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901024 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-lib-modules\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"host\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-host\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-modprobe-d\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-modprobe-d\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901048 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-var-lib-kubelet\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901055 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-kubernetes\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-kubernetes\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901090 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"var-lib-kubelet\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-var-lib-kubelet\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-systemd\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901133 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901142 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-tuned\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901168 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysctl-conf\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-systemd\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-systemd\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysctl-d\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901240 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-run\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901299 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"run\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-run\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901314 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-conf\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysctl-conf\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.901617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.901360 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-sysctl-d\" (UniqueName: \"kubernetes.io/host-path/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-sysctl-d\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.903005 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.902989 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp\" (UniqueName: \"kubernetes.io/empty-dir/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-tmp\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.903070 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.903021 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"etc-tuned\" (UniqueName: \"kubernetes.io/empty-dir/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-etc-tuned\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.908563 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.908546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wdbg6\" (UniqueName: \"kubernetes.io/projected/9a82984f-1f3f-4b3e-9bac-becc5338c0a3-kube-api-access-wdbg6\") pod \"tuned-9wqrw\" (UID: \"9a82984f-1f3f-4b3e-9bac-becc5338c0a3\") " pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:21.982618 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.982487 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/node-resolver-89qsg"
Apr 23 13:31:21.991019 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:21.990996 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8e915a91_9dcb_4454_9ac4_0012727f6bdd.slice/crio-a6187955da9ca50bfc21d12ff6cbf6826a3b3af4624c46052674bdec412fabd6 WatchSource:0}: Error finding container a6187955da9ca50bfc21d12ff6cbf6826a3b3af4624c46052674bdec412fabd6: Status 404 returned error can't find the container with id a6187955da9ca50bfc21d12ff6cbf6826a3b3af4624c46052674bdec412fabd6
Apr 23 13:31:21.992008 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:21.991990 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-operator/iptables-alerter-pxr5c"
Apr 23 13:31:21.999190 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:21.999170 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode3dce136_00f1_4f99_834a_c7f4d7ae44af.slice/crio-490e9ef77fa459c05708b6c6782e5ab07d77f050bd8edcb43c0824099ed8b300 WatchSource:0}: Error finding container 490e9ef77fa459c05708b6c6782e5ab07d77f050bd8edcb43c0824099ed8b300: Status 404 returned error can't find the container with id 490e9ef77fa459c05708b6c6782e5ab07d77f050bd8edcb43c0824099ed8b300
Apr 23 13:31:22.002290 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.002253 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk"
Apr 23 13:31:22.007318 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.007136 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/konnectivity-agent-wr4q4"
Apr 23 13:31:22.010847 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:22.010826 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8df3caba_2d71_4077_8a19_92dfab41c079.slice/crio-d6874a6d2db2fae76a2ee3a3f98eb0ff75faf6935dd4be1b5731a1e2d30df8ab WatchSource:0}: Error finding container d6874a6d2db2fae76a2ee3a3f98eb0ff75faf6935dd4be1b5731a1e2d30df8ab: Status 404 returned error can't find the container with id d6874a6d2db2fae76a2ee3a3f98eb0ff75faf6935dd4be1b5731a1e2d30df8ab
Apr 23 13:31:22.013852 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.013831 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-bltjj"
Apr 23 13:31:22.015059 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:22.015036 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod049aee3e_d268_4368_b645_787f7d1e1152.slice/crio-4f498ba0fd8c34fe6997c585d3a6f49db0e4c5ed5dbca6367f4d23eac63ccb91 WatchSource:0}: Error finding container 4f498ba0fd8c34fe6997c585d3a6f49db0e4c5ed5dbca6367f4d23eac63ccb91: Status 404 returned error can't find the container with id 4f498ba0fd8c34fe6997c585d3a6f49db0e4c5ed5dbca6367f4d23eac63ccb91
Apr 23 13:31:22.020543 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.020451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/node-ca-7g8xg"
Apr 23 13:31:22.021634 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:22.021508 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod10025a80_efb0_4838_a4b5_8e9ea110d4e1.slice/crio-236306a5401b017e6c700f56fe44a3c223054235383af2f413197173903ad59b WatchSource:0}: Error finding container 236306a5401b017e6c700f56fe44a3c223054235383af2f413197173903ad59b: Status 404 returned error can't find the container with id 236306a5401b017e6c700f56fe44a3c223054235383af2f413197173903ad59b
Apr 23 13:31:22.027695 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.027677 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/multus-additional-cni-plugins-b6xwg"
Apr 23 13:31:22.028802 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:22.028548 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfccbedc6_6cb0_47bb_8b72_95f91484d090.slice/crio-d2c9e4025a8c6989e84143ac818cea1d07dcfe64ff4c342fbe215a294dba2465 WatchSource:0}: Error finding container d2c9e4025a8c6989e84143ac818cea1d07dcfe64ff4c342fbe215a294dba2465: Status 404 returned error can't find the container with id d2c9e4025a8c6989e84143ac818cea1d07dcfe64ff4c342fbe215a294dba2465
Apr 23 13:31:22.033699 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.033268 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-node-tuning-operator/tuned-9wqrw"
Apr 23 13:31:22.037608 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.037405 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5"
Apr 23 13:31:22.046888 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:22.046865 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3cb49d5f_4e57_4178_9753_b0d23608237e.slice/crio-01aabce4d1dd16bba6af10af82b8c1fcd2ac34f3186376b015c0af4465a95f69 WatchSource:0}: Error finding container 01aabce4d1dd16bba6af10af82b8c1fcd2ac34f3186376b015c0af4465a95f69: Status 404 returned error can't find the container with id 01aabce4d1dd16bba6af10af82b8c1fcd2ac34f3186376b015c0af4465a95f69
Apr 23 13:31:22.304891 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.304811 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:31:22.304891 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.304861 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6"
Apr 23 13:31:22.305097 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:22.304932 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:22.305097 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:22.304953 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:22.305097 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:22.305005 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. No retries permitted until 2026-04-23 13:31:23.304988529 +0000 UTC m=+4.052990478 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:22.305097 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:22.304956 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:22.305097 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:22.305041 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4qksw for pod openshift-network-diagnostics/network-check-target-pzpnn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:22.305097 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:22.305092 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw podName:2a7db36c-22a7-4fed-ba55-f60113c7ad0c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:23.305076286 +0000 UTC m=+4.053078223 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "kube-api-access-4qksw" (UniqueName: "kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw") pod "network-check-target-pzpnn" (UID: "2a7db36c-22a7-4fed-ba55-f60113c7ad0c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:22.724527 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.724415 2576 certificate_manager.go:715] "Certificate rotation deadline determined" logger="kubernetes.io/kubelet-serving" expiration="2028-04-22 13:26:20 +0000 UTC" deadline="2027-12-23 20:49:45.481453699 +0000 UTC"
Apr 23 13:31:22.724527 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.724453 2576 certificate_manager.go:431] "Waiting for next certificate rotation" logger="kubernetes.io/kubelet-serving" sleep="14623h18m22.757006236s"
Apr 23 13:31:22.821310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.821275 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:31:22.821484 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:22.821460 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c"
Apr 23 13:31:22.825523 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.825496 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6"
Apr 23 13:31:22.825712 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:22.825647 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb"
Apr 23 13:31:22.829059 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.828783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" event={"ID":"3cb49d5f-4e57-4178-9753-b0d23608237e","Type":"ContainerStarted","Data":"01aabce4d1dd16bba6af10af82b8c1fcd2ac34f3186376b015c0af4465a95f69"}
Apr 23 13:31:22.830898 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.830874 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerStarted","Data":"73a4769934a5a77428f3064dec75ee19b10ac4048caf6e55bfef64cb4f66d1ed"}
Apr 23 13:31:22.831914 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.831895 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"d6874a6d2db2fae76a2ee3a3f98eb0ff75faf6935dd4be1b5731a1e2d30df8ab"}
Apr 23 13:31:22.833359 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.833322 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pxr5c" event={"ID":"e3dce136-00f1-4f99-834a-c7f4d7ae44af","Type":"ContainerStarted","Data":"490e9ef77fa459c05708b6c6782e5ab07d77f050bd8edcb43c0824099ed8b300"}
Apr 23 13:31:22.834910 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.834832 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-89qsg" event={"ID":"8e915a91-9dcb-4454-9ac4-0012727f6bdd","Type":"ContainerStarted","Data":"a6187955da9ca50bfc21d12ff6cbf6826a3b3af4624c46052674bdec412fabd6"}
Apr 23 13:31:22.839125 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.839102 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" event={"ID":"9a82984f-1f3f-4b3e-9bac-becc5338c0a3","Type":"ContainerStarted","Data":"32c2eb6dc3b2e90d0cf9bb4eafd5397a919887309c241edfcbdc35f5adf4faa7"}
Apr 23 13:31:22.840579 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.840558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7g8xg" event={"ID":"fccbedc6-6cb0-47bb-8b72-95f91484d090","Type":"ContainerStarted","Data":"d2c9e4025a8c6989e84143ac818cea1d07dcfe64ff4c342fbe215a294dba2465"}
Apr 23 13:31:22.841817 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.841795 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bltjj" event={"ID":"10025a80-efb0-4838-a4b5-8e9ea110d4e1","Type":"ContainerStarted","Data":"236306a5401b017e6c700f56fe44a3c223054235383af2f413197173903ad59b"}
Apr 23 13:31:22.842741 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.842717 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wr4q4" event={"ID":"049aee3e-d268-4368-b645-787f7d1e1152","Type":"ContainerStarted","Data":"4f498ba0fd8c34fe6997c585d3a6f49db0e4c5ed5dbca6367f4d23eac63ccb91"}
Apr 23 13:31:22.845363 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:22.845225 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal" event={"ID":"5447c4ab751e776bc417eb19d7fa6547","Type":"ContainerStarted","Data":"497e369fa166bf9291dac3ca28b793236b0b1b3b9b4f52b93cb7882f2ec27ad1"}
Apr 23 13:31:23.313457 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:23.312672 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:31:23.313457 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:23.312740 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6"
Apr 23 13:31:23.313457 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:23.312856 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:23.313457 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:23.312920 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. No retries permitted until 2026-04-23 13:31:25.312901188 +0000 UTC m=+6.060903125 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : object "openshift-multus"/"metrics-daemon-secret" not registered
Apr 23 13:31:23.313457 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:23.313341 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered
Apr 23 13:31:23.313457 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:23.313362 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered
Apr 23 13:31:23.313457 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:23.313375 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4qksw for pod openshift-network-diagnostics/network-check-target-pzpnn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered]
Apr 23 13:31:23.313457 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:23.313420 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw podName:2a7db36c-22a7-4fed-ba55-f60113c7ad0c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:25.313405529 +0000 UTC m=+6.061407465 (durationBeforeRetry 2s).
Error: MountVolume.SetUp failed for volume "kube-api-access-4qksw" (UniqueName: "kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw") pod "network-check-target-pzpnn" (UID: "2a7db36c-22a7-4fed-ba55-f60113c7ad0c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:23.853644 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:23.852754 2576 generic.go:358] "Generic (PLEG): container finished" podID="5447c4ab751e776bc417eb19d7fa6547" containerID="497e369fa166bf9291dac3ca28b793236b0b1b3b9b4f52b93cb7882f2ec27ad1" exitCode=0 Apr 23 13:31:23.853644 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:23.852803 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal" event={"ID":"5447c4ab751e776bc417eb19d7fa6547","Type":"ContainerDied","Data":"497e369fa166bf9291dac3ca28b793236b0b1b3b9b4f52b93cb7882f2ec27ad1"} Apr 23 13:31:24.820775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:24.820742 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:24.820960 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:24.820861 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:24.820960 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:24.820948 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:24.821083 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:24.821066 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:25.329991 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:25.329956 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:25.330466 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:25.330015 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:25.330466 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:25.330155 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:25.330466 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:25.330217 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. 
No retries permitted until 2026-04-23 13:31:29.330197898 +0000 UTC m=+10.078199837 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:25.330804 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:25.330644 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:25.330804 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:25.330665 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:25.330804 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:25.330680 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4qksw for pod openshift-network-diagnostics/network-check-target-pzpnn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:25.330804 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:25.330725 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw podName:2a7db36c-22a7-4fed-ba55-f60113c7ad0c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:29.330710318 +0000 UTC m=+10.078712256 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "kube-api-access-4qksw" (UniqueName: "kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw") pod "network-check-target-pzpnn" (UID: "2a7db36c-22a7-4fed-ba55-f60113c7ad0c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:26.820889 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:26.820854 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:26.821355 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:26.820981 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:26.821433 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:26.820855 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:26.821516 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:26.821486 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:28.821233 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:28.821196 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:28.821683 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:28.821198 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:28.821683 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:28.821318 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:28.821683 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:28.821424 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:29.363075 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:29.363039 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:29.363267 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:29.363129 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:29.363267 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:29.363210 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:29.363267 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:29.363254 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:29.363447 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:29.363273 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:29.363447 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:29.363286 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4qksw for pod openshift-network-diagnostics/network-check-target-pzpnn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, 
object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:29.363447 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:29.363288 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. No retries permitted until 2026-04-23 13:31:37.363264951 +0000 UTC m=+18.111266885 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:29.363447 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:29.363349 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw podName:2a7db36c-22a7-4fed-ba55-f60113c7ad0c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:37.363319711 +0000 UTC m=+18.111321661 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "kube-api-access-4qksw" (UniqueName: "kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw") pod "network-check-target-pzpnn" (UID: "2a7db36c-22a7-4fed-ba55-f60113c7ad0c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:30.820432 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:30.820404 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:30.820802 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:30.820418 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:30.820802 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:30.820515 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:30.820802 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:30.820618 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:32.820826 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:32.820788 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:32.821237 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:32.820792 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:32.821237 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:32.820896 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:32.821237 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:32.821006 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:34.820317 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:34.820286 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:34.820699 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:34.820286 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:34.820699 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:34.820418 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:34.820699 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:34.820536 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:36.820409 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:36.820376 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:36.820953 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:36.820376 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:36.820953 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:36.820489 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:36.820953 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:36.820596 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:37.427179 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:37.427145 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:37.427480 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:37.427204 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:37.427480 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:37.427304 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:37.427480 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:37.427346 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/kube-root-ca.crt: object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered Apr 23 13:31:37.427480 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:37.427370 2576 projected.go:289] Couldn't get configMap openshift-network-diagnostics/openshift-service-ca.crt: object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered Apr 23 13:31:37.427480 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:37.427385 2576 projected.go:194] Error preparing data for projected volume kube-api-access-4qksw for pod openshift-network-diagnostics/network-check-target-pzpnn: [object "openshift-network-diagnostics"/"kube-root-ca.crt" not 
registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:37.427480 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:37.427390 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. No retries permitted until 2026-04-23 13:31:53.42736007 +0000 UTC m=+34.175362021 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : object "openshift-multus"/"metrics-daemon-secret" not registered Apr 23 13:31:37.427480 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:37.427435 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw podName:2a7db36c-22a7-4fed-ba55-f60113c7ad0c nodeName:}" failed. No retries permitted until 2026-04-23 13:31:53.427419494 +0000 UTC m=+34.175421443 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "kube-api-access-4qksw" (UniqueName: "kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw") pod "network-check-target-pzpnn" (UID: "2a7db36c-22a7-4fed-ba55-f60113c7ad0c") : [object "openshift-network-diagnostics"/"kube-root-ca.crt" not registered, object "openshift-network-diagnostics"/"openshift-service-ca.crt" not registered] Apr 23 13:31:38.820903 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.820675 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:38.821622 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:38.821003 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:38.821622 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.821115 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:38.821622 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:38.821199 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:38.878380 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.877970 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal" event={"ID":"5ea2aeb3a492354a2bfeec0a963ac187","Type":"ContainerStarted","Data":"44b162397001b0b4e2554f91a42238b55a70a596610eca2ce744491262ef0086"} Apr 23 13:31:38.885048 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.884988 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" event={"ID":"9a82984f-1f3f-4b3e-9bac-becc5338c0a3","Type":"ContainerStarted","Data":"1956701efcde69121456377dd988d2396ab66fa403741c64c884456640db6ef9"} Apr 23 13:31:38.888043 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.888005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-bltjj" event={"ID":"10025a80-efb0-4838-a4b5-8e9ea110d4e1","Type":"ContainerStarted","Data":"d3d1229da22c944ff1d69c5e14a5c890ef413e37fe15bb35da10e5bc538409e4"} Apr 23 13:31:38.891491 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.891468 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/konnectivity-agent-wr4q4" event={"ID":"049aee3e-d268-4368-b645-787f7d1e1152","Type":"ContainerStarted","Data":"f9f0fca08aea8e3c3835160a1b1155fe8690107b6ac13044ce5ec253ade9af33"} Apr 23 13:31:38.894992 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.894968 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal" event={"ID":"5447c4ab751e776bc417eb19d7fa6547","Type":"ContainerStarted","Data":"9e625ae091e07d79874b01fd6d8cbc08d20055d6584a5798e3d830f73af919e9"} Apr 23 13:31:38.895469 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.895253 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/kube-apiserver-proxy-ip-10-0-135-229.ec2.internal" podStartSLOduration=17.895240754 podStartE2EDuration="17.895240754s" podCreationTimestamp="2026-04-23 13:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:38.894942583 +0000 UTC m=+19.642944542" watchObservedRunningTime="2026-04-23 13:31:38.895240754 +0000 UTC m=+19.643242709" Apr 23 13:31:38.897148 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.897115 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" event={"ID":"3cb49d5f-4e57-4178-9753-b0d23608237e","Type":"ContainerStarted","Data":"76c104a7cc3b424fcff3f816d0a703c62011d64341c2d4af85b4169441486090"} Apr 23 13:31:38.899308 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.899014 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerStarted","Data":"4ff78dd821f66bdb088f29faa8e14e2167f1b938107535e51142019f1f332c72"} Apr 23 13:31:38.902512 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.902494 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"3dc526e093a691c83b81f835800970561d40c727d20e321ca488d37629a46413"} Apr 23 13:31:38.902593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.902516 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"7d27d32ef8ceefc1fcc3180d4508f2a6d8a5d49ebc9ee8731d90400a5527d515"} Apr 23 13:31:38.911032 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.910997 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kube-system/konnectivity-agent-wr4q4" podStartSLOduration=7.877932487 podStartE2EDuration="19.910983088s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:22.017406982 +0000 UTC m=+2.765408918" lastFinishedPulling="2026-04-23 13:31:34.050457586 +0000 UTC m=+14.798459519" observedRunningTime="2026-04-23 13:31:38.91066232 +0000 UTC m=+19.658664275" watchObservedRunningTime="2026-04-23 13:31:38.910983088 +0000 UTC m=+19.658985043" Apr 23 13:31:38.926724 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.926683 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-bltjj" podStartSLOduration=3.373535772 podStartE2EDuration="19.926672996s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:22.022717717 +0000 UTC m=+2.770719654" lastFinishedPulling="2026-04-23 13:31:38.575854933 +0000 UTC m=+19.323856878" observedRunningTime="2026-04-23 13:31:38.926554677 +0000 UTC m=+19.674556632" watchObservedRunningTime="2026-04-23 13:31:38.926672996 +0000 UTC m=+19.674674951" Apr 23 13:31:38.943290 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.943065 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-node-tuning-operator/tuned-9wqrw" podStartSLOduration=3.791552851 podStartE2EDuration="19.9430492s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:22.045102566 +0000 UTC m=+2.793104502" lastFinishedPulling="2026-04-23 13:31:38.196598913 +0000 UTC m=+18.944600851" observedRunningTime="2026-04-23 13:31:38.942219414 +0000 UTC m=+19.690221384" watchObservedRunningTime="2026-04-23 13:31:38.9430492 +0000 UTC m=+19.691051155" Apr 23 13:31:38.956554 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:38.956518 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-machine-config-operator/kube-rbac-proxy-crio-ip-10-0-135-229.ec2.internal" 
podStartSLOduration=17.956504994 podStartE2EDuration="17.956504994s" podCreationTimestamp="2026-04-23 13:31:21 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:31:38.956051763 +0000 UTC m=+19.704053718" watchObservedRunningTime="2026-04-23 13:31:38.956504994 +0000 UTC m=+19.704506948" Apr 23 13:31:39.753573 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.753437 2576 plugin_watcher.go:194] "Adding socket path or updating timestamp to desired state cache" path="/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock" Apr 23 13:31:39.904850 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.904770 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-operator/iptables-alerter-pxr5c" event={"ID":"e3dce136-00f1-4f99-834a-c7f4d7ae44af","Type":"ContainerStarted","Data":"ee3e318ab811c6b7f61215490f3a6ebe32103965a20ff8a6034efb30ac596e20"} Apr 23 13:31:39.906025 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.905999 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/node-resolver-89qsg" event={"ID":"8e915a91-9dcb-4454-9ac4-0012727f6bdd","Type":"ContainerStarted","Data":"a53b7e91f0ee0e1ae1d30f505fa4e9703bc1f2e638f87d5db9cc73da517a18e1"} Apr 23 13:31:39.907175 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.907155 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/node-ca-7g8xg" event={"ID":"fccbedc6-6cb0-47bb-8b72-95f91484d090","Type":"ContainerStarted","Data":"f4bed2dfd0299704d35a644c6d221814a63a2cde58bba6e283c36fea5a574ba8"} Apr 23 13:31:39.908630 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.908604 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" event={"ID":"3cb49d5f-4e57-4178-9753-b0d23608237e","Type":"ContainerStarted","Data":"2d5ce2a401d976ff96857031659e10d5bec84e5fabaad0ca7fbcf654dd45749f"} Apr 23 
13:31:39.909641 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.909622 2576 generic.go:358] "Generic (PLEG): container finished" podID="58b92c1d-fc85-4d19-82d4-79f878c270ce" containerID="4ff78dd821f66bdb088f29faa8e14e2167f1b938107535e51142019f1f332c72" exitCode=0 Apr 23 13:31:39.909725 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.909672 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerDied","Data":"4ff78dd821f66bdb088f29faa8e14e2167f1b938107535e51142019f1f332c72"} Apr 23 13:31:39.912034 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.912017 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"1a0658b6941e8632aec8305ba8cff51ca6e2dfba2d8d99ae8f8f9f37bc1efe54"} Apr 23 13:31:39.912110 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.912039 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"63a1a7d5e22fa3103ea6523a919089a5db3ce00539dad03918f24b2daef4989c"} Apr 23 13:31:39.912110 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.912050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"3f2b517ec0800aa9fc93a587f766aef81a541f8051f1ee2c2f1c9a3540480738"} Apr 23 13:31:39.912110 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.912063 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"aab715cd2bb178b54967eacc924103ac9aea6585c99be2be1a8c5746760349b0"} Apr 23 13:31:39.919986 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.919953 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-operator/iptables-alerter-pxr5c" podStartSLOduration=4.367389501 podStartE2EDuration="20.919943434s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:22.000719682 +0000 UTC m=+2.748721633" lastFinishedPulling="2026-04-23 13:31:38.553273617 +0000 UTC m=+19.301275566" observedRunningTime="2026-04-23 13:31:39.919755344 +0000 UTC m=+20.667757309" watchObservedRunningTime="2026-04-23 13:31:39.919943434 +0000 UTC m=+20.667945389" Apr 23 13:31:39.953446 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:39.953402 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/node-resolver-89qsg" podStartSLOduration=4.440246102 podStartE2EDuration="20.953390677s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:21.992869192 +0000 UTC m=+2.740871128" lastFinishedPulling="2026-04-23 13:31:38.506013761 +0000 UTC m=+19.254015703" observedRunningTime="2026-04-23 13:31:39.953315703 +0000 UTC m=+20.701317658" watchObservedRunningTime="2026-04-23 13:31:39.953390677 +0000 UTC m=+20.701392631" Apr 23 13:31:40.028344 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.028307 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:40.753162 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.753031 2576 reconciler.go:161] "OperationExecutor.RegisterPlugin started" plugin={"SocketPath":"/var/lib/kubelet/plugins_registry/ebs.csi.aws.com-reg.sock","Timestamp":"2026-04-23T13:31:39.753568846Z","UUID":"19de4bfd-e41b-41f6-a67f-447214ce83a4","Handler":null,"Name":"","Endpoint":""} Apr 23 13:31:40.754817 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.754798 2576 csi_plugin.go:106] kubernetes.io/csi: Trying to validate a new CSI Driver with name: 
ebs.csi.aws.com endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock versions: 1.0.0 Apr 23 13:31:40.754817 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.754822 2576 csi_plugin.go:119] kubernetes.io/csi: Register new plugin with name: ebs.csi.aws.com at endpoint: /var/lib/kubelet/plugins/ebs.csi.aws.com/csi.sock Apr 23 13:31:40.821121 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.821092 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:40.821251 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.821161 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:40.821315 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:40.821261 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:40.821403 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:40.821385 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:40.915951 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.915919 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" event={"ID":"3cb49d5f-4e57-4178-9753-b0d23608237e","Type":"ContainerStarted","Data":"1a215e9f63f0f49e70e3a2d2246fbd3bf4354b3da4f930766de929638fdbea36"} Apr 23 13:31:40.936519 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.936478 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/node-ca-7g8xg" podStartSLOduration=5.462797301 podStartE2EDuration="21.936466045s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:22.03256696 +0000 UTC m=+2.780568899" lastFinishedPulling="2026-04-23 13:31:38.506235709 +0000 UTC m=+19.254237643" observedRunningTime="2026-04-23 13:31:39.96968073 +0000 UTC m=+20.717682695" watchObservedRunningTime="2026-04-23 13:31:40.936466045 +0000 UTC m=+21.684467999" Apr 23 13:31:40.936859 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:40.936830 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-csi-drivers/aws-ebs-csi-driver-node-d9cn5" podStartSLOduration=3.51406602 podStartE2EDuration="21.936820369s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:22.049236662 +0000 UTC m=+2.797238599" lastFinishedPulling="2026-04-23 13:31:40.471991002 +0000 UTC m=+21.219992948" observedRunningTime="2026-04-23 13:31:40.936360621 +0000 UTC m=+21.684362574" watchObservedRunningTime="2026-04-23 13:31:40.936820369 +0000 UTC m=+21.684822326" Apr 23 13:31:41.921168 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:41.921125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" 
event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"901f2b82bce86354d2ebcdf4fd475a78be553ef6ab8e116b2a28468d3eb8d717"} Apr 23 13:31:42.820476 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:42.820440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:42.820608 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:42.820440 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:42.820608 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:42.820557 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:42.820697 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:42.820607 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:43.388730 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:43.388699 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:43.389500 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:43.389479 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:43.925612 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:43.925586 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kube-system/konnectivity-agent-wr4q4" Apr 23 13:31:44.821270 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.821105 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:44.821675 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.821106 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:44.821675 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:44.821357 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:44.821675 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:44.821434 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:44.927943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.927919 2576 generic.go:358] "Generic (PLEG): container finished" podID="58b92c1d-fc85-4d19-82d4-79f878c270ce" containerID="b9e68389678149d8b10987f0571db025a1cf20d5d3dc4d791443f2585a5eb5b9" exitCode=0 Apr 23 13:31:44.928082 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.928009 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerDied","Data":"b9e68389678149d8b10987f0571db025a1cf20d5d3dc4d791443f2585a5eb5b9"} Apr 23 13:31:44.931063 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.931043 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" event={"ID":"8df3caba-2d71-4077-8a19-92dfab41c079","Type":"ContainerStarted","Data":"089d8eb1d511fa4082c59056f29e0530125d0bee596380fa38b9dc751925b569"} Apr 23 13:31:44.931449 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.931426 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:44.931527 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.931455 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:44.946809 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.946790 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:44.981680 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:44.981642 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" podStartSLOduration=9.380653372 podStartE2EDuration="25.981630742s" 
podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:22.01231927 +0000 UTC m=+2.760321206" lastFinishedPulling="2026-04-23 13:31:38.613296641 +0000 UTC m=+19.361298576" observedRunningTime="2026-04-23 13:31:44.9803244 +0000 UTC m=+25.728326354" watchObservedRunningTime="2026-04-23 13:31:44.981630742 +0000 UTC m=+25.729632717" Apr 23 13:31:45.932923 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:45.932889 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:45.947502 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:45.947480 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk" Apr 23 13:31:46.197235 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:46.197032 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ms5b6"] Apr 23 13:31:46.197400 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:46.197311 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:46.197468 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:46.197445 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:46.197799 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:46.197772 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pzpnn"] Apr 23 13:31:46.197901 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:46.197893 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:46.198023 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:46.197987 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:46.935758 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:46.935722 2576 generic.go:358] "Generic (PLEG): container finished" podID="58b92c1d-fc85-4d19-82d4-79f878c270ce" containerID="d419a6d8de8cade241bd278ea7ced0b9e9db1a5868157eb7e6b155f7046de180" exitCode=0 Apr 23 13:31:46.936113 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:46.935812 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerDied","Data":"d419a6d8de8cade241bd278ea7ced0b9e9db1a5868157eb7e6b155f7046de180"} Apr 23 13:31:47.820865 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:47.820838 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:47.820983 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:47.820866 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:47.820983 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:47.820968 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. 
Has your network provider started?" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:47.821089 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:47.821067 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:48.940754 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:48.940720 2576 generic.go:358] "Generic (PLEG): container finished" podID="58b92c1d-fc85-4d19-82d4-79f878c270ce" containerID="f08fa97c59f046965a196c216343484c78e1338b8feca9f0d91fc74209fc4b15" exitCode=0 Apr 23 13:31:48.941582 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:48.940766 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerDied","Data":"f08fa97c59f046965a196c216343484c78e1338b8feca9f0d91fc74209fc4b15"} Apr 23 13:31:49.822154 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:49.822119 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:31:49.822305 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:49.822235 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" 
pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb" Apr 23 13:31:49.822395 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:49.822305 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn" Apr 23 13:31:49.822455 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:49.822417 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="network is not ready: container runtime network not ready: NetworkReady=false reason:NetworkPluginNotReady message:Network plugin returns error: no CNI configuration file in /etc/kubernetes/cni/net.d/. Has your network provider started?" pod="openshift-network-diagnostics/network-check-target-pzpnn" podUID="2a7db36c-22a7-4fed-ba55-f60113c7ad0c" Apr 23 13:31:50.594290 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.594264 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeReady" Apr 23 13:31:50.594762 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.594420 2576 kubelet_node_status.go:550] "Fast updating node status as it just became ready" Apr 23 13:31:50.611215 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.611187 2576 kubelet_node_status.go:736] "Recording event message for node" node="ip-10-0-135-229.ec2.internal" event="NodeReady" Apr 23 13:31:50.642894 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.642870 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress-canary/ingress-canary-cq9bf"] Apr 23 13:31:50.669736 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.669713 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-dns/dns-default-678lb"] Apr 23 13:31:50.669889 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.669807 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cq9bf" Apr 23 13:31:50.672620 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.672597 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"canary-serving-cert\"" Apr 23 13:31:50.672714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.672622 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"openshift-service-ca.crt\"" Apr 23 13:31:50.672714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.672633 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pj4z2\"" Apr 23 13:31:50.672714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.672623 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress-canary\"/\"kube-root-ca.crt\"" Apr 23 13:31:50.680720 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.680703 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cq9bf"] Apr 23 13:31:50.680820 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.680728 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-678lb"] Apr 23 13:31:50.680880 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.680834 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.683736 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.683719 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-default-metrics-tls\"" Apr 23 13:31:50.683802 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.683757 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-dns\"/\"dns-default\"" Apr 23 13:31:50.683985 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.683972 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gtr2j\"" Apr 23 13:31:50.830486 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.830454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0283a7-9d03-45eb-8654-fca71445e53e-config-volume\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.830638 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.830517 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sdkz4\" (UniqueName: \"kubernetes.io/projected/c01c5eac-4623-474c-9b1f-de78e668fd57-kube-api-access-sdkz4\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf" Apr 23 13:31:50.830638 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.830559 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf" Apr 23 13:31:50.830638 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:31:50.830582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa0283a7-9d03-45eb-8654-fca71445e53e-tmp-dir\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.830638 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.830606 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4v7lx\" (UniqueName: \"kubernetes.io/projected/fa0283a7-9d03-45eb-8654-fca71445e53e-kube-api-access-4v7lx\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.830811 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.830694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.931984 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.931907 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf" Apr 23 13:31:50.931984 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.931954 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa0283a7-9d03-45eb-8654-fca71445e53e-tmp-dir\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.931984 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.931981 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4v7lx\" (UniqueName: \"kubernetes.io/projected/fa0283a7-9d03-45eb-8654-fca71445e53e-kube-api-access-4v7lx\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.932248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.932012 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.932248 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:50.932026 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found Apr 23 13:31:50.932248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.932044 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0283a7-9d03-45eb-8654-fca71445e53e-config-volume\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb" Apr 23 13:31:50.932248 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:50.932092 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert podName:c01c5eac-4623-474c-9b1f-de78e668fd57 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:51.432074496 +0000 UTC m=+32.180076449 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert") pod "ingress-canary-cq9bf" (UID: "c01c5eac-4623-474c-9b1f-de78e668fd57") : secret "canary-serving-cert" not found Apr 23 13:31:50.932248 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:50.932189 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found Apr 23 13:31:50.932248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.932209 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sdkz4\" (UniqueName: \"kubernetes.io/projected/c01c5eac-4623-474c-9b1f-de78e668fd57-kube-api-access-sdkz4\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf" Apr 23 13:31:50.932523 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:50.932259 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls podName:fa0283a7-9d03-45eb-8654-fca71445e53e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:51.432242051 +0000 UTC m=+32.180243990 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls") pod "dns-default-678lb" (UID: "fa0283a7-9d03-45eb-8654-fca71445e53e") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:50.932523 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.932365 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/fa0283a7-9d03-45eb-8654-fca71445e53e-tmp-dir\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:31:50.932750 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.932731 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/configmap/fa0283a7-9d03-45eb-8654-fca71445e53e-config-volume\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:31:50.942722 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.942698 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sdkz4\" (UniqueName: \"kubernetes.io/projected/c01c5eac-4623-474c-9b1f-de78e668fd57-kube-api-access-sdkz4\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:31:50.951026 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:50.951007 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4v7lx\" (UniqueName: \"kubernetes.io/projected/fa0283a7-9d03-45eb-8654-fca71445e53e-kube-api-access-4v7lx\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:31:51.435625 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.435592 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:31:51.435812 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.435634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:31:51.435812 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:51.435751 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:51.435812 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:51.435754 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:51.435943 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:51.435816 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls podName:fa0283a7-9d03-45eb-8654-fca71445e53e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:52.435802023 +0000 UTC m=+33.183803956 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls") pod "dns-default-678lb" (UID: "fa0283a7-9d03-45eb-8654-fca71445e53e") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:51.435943 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:51.435835 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert podName:c01c5eac-4623-474c-9b1f-de78e668fd57 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:52.435824887 +0000 UTC m=+33.183826822 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert") pod "ingress-canary-cq9bf" (UID: "c01c5eac-4623-474c-9b1f-de78e668fd57") : secret "canary-serving-cert" not found
Apr 23 13:31:51.820632 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.820600 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:31:51.821843 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.821381 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6"
Apr 23 13:31:51.823871 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.823851 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-network-diagnostics\"/\"default-dockercfg-5zcnp\""
Apr 23 13:31:51.825024 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.824998 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"openshift-service-ca.crt\""
Apr 23 13:31:51.825134 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.825004 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-secret\""
Apr 23 13:31:51.825134 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.825041 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lbjnn\""
Apr 23 13:31:51.825235 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:51.825005 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-network-diagnostics\"/\"kube-root-ca.crt\""
Apr 23 13:31:52.443735 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:52.443568 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:31:52.443904 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:52.443751 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:31:52.443904 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:52.443705 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:52.443904 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:52.443849 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert podName:c01c5eac-4623-474c-9b1f-de78e668fd57 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:54.443834542 +0000 UTC m=+35.191836479 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert") pod "ingress-canary-cq9bf" (UID: "c01c5eac-4623-474c-9b1f-de78e668fd57") : secret "canary-serving-cert" not found
Apr 23 13:31:52.443904 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:52.443854 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:52.443904 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:52.443894 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls podName:fa0283a7-9d03-45eb-8654-fca71445e53e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:54.443882967 +0000 UTC m=+35.191884900 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls") pod "dns-default-678lb" (UID: "fa0283a7-9d03-45eb-8654-fca71445e53e") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:53.452270 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:53.452229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:31:53.452726 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:53.452290 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6"
Apr 23 13:31:53.452726 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:53.452408 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:31:53.452726 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:53.452470 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. No retries permitted until 2026-04-23 13:32:25.452452781 +0000 UTC m=+66.200454719 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : secret "metrics-daemon-secret" not found
Apr 23 13:31:53.455570 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:53.455541 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4qksw\" (UniqueName: \"kubernetes.io/projected/2a7db36c-22a7-4fed-ba55-f60113c7ad0c-kube-api-access-4qksw\") pod \"network-check-target-pzpnn\" (UID: \"2a7db36c-22a7-4fed-ba55-f60113c7ad0c\") " pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:31:53.634989 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:53.634955 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:31:54.459744 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:54.459709 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:31:54.460083 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:54.459770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:31:54.460083 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:54.459860 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:54.460083 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:54.459862 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:54.460083 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:54.459914 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert podName:c01c5eac-4623-474c-9b1f-de78e668fd57 nodeName:}" failed. No retries permitted until 2026-04-23 13:31:58.459900348 +0000 UTC m=+39.207902299 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert") pod "ingress-canary-cq9bf" (UID: "c01c5eac-4623-474c-9b1f-de78e668fd57") : secret "canary-serving-cert" not found
Apr 23 13:31:54.460083 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:54.459926 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls podName:fa0283a7-9d03-45eb-8654-fca71445e53e nodeName:}" failed. No retries permitted until 2026-04-23 13:31:58.459920827 +0000 UTC m=+39.207922760 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls") pod "dns-default-678lb" (UID: "fa0283a7-9d03-45eb-8654-fca71445e53e") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:54.623887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:54.623845 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-network-diagnostics/network-check-target-pzpnn"]
Apr 23 13:31:54.719497 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:31:54.719445 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2a7db36c_22a7_4fed_ba55_f60113c7ad0c.slice/crio-2dc24744ad6e500938a3c6a1146b4e099c48f5d256febf9f10f076316abc6f57 WatchSource:0}: Error finding container 2dc24744ad6e500938a3c6a1146b4e099c48f5d256febf9f10f076316abc6f57: Status 404 returned error can't find the container with id 2dc24744ad6e500938a3c6a1146b4e099c48f5d256febf9f10f076316abc6f57
Apr 23 13:31:54.956641 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:54.956609 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerStarted","Data":"e7388cc36c654a63f346506cce82273ac9b34a9d33ea814fb09767010497f530"}
Apr 23 13:31:54.957600 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:54.957574 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pzpnn" event={"ID":"2a7db36c-22a7-4fed-ba55-f60113c7ad0c","Type":"ContainerStarted","Data":"2dc24744ad6e500938a3c6a1146b4e099c48f5d256febf9f10f076316abc6f57"}
Apr 23 13:31:55.962388 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:55.962214 2576 generic.go:358] "Generic (PLEG): container finished" podID="58b92c1d-fc85-4d19-82d4-79f878c270ce" containerID="e7388cc36c654a63f346506cce82273ac9b34a9d33ea814fb09767010497f530" exitCode=0
Apr 23 13:31:55.962783 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:55.962289 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerDied","Data":"e7388cc36c654a63f346506cce82273ac9b34a9d33ea814fb09767010497f530"}
Apr 23 13:31:56.967563 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:56.967532 2576 generic.go:358] "Generic (PLEG): container finished" podID="58b92c1d-fc85-4d19-82d4-79f878c270ce" containerID="1d75c3d07a3caeab4b8b63b4affd1da2cfdaf34ab27a071047ff6ab4ba13c094" exitCode=0
Apr 23 13:31:56.967905 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:56.967595 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerDied","Data":"1d75c3d07a3caeab4b8b63b4affd1da2cfdaf34ab27a071047ff6ab4ba13c094"}
Apr 23 13:31:58.491434 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:58.491406 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:31:58.491735 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:58.491458 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:31:58.491735 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:58.491541 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:31:58.491735 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:58.491542 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:31:58.491735 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:58.491582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert podName:c01c5eac-4623-474c-9b1f-de78e668fd57 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:06.49156976 +0000 UTC m=+47.239571693 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert") pod "ingress-canary-cq9bf" (UID: "c01c5eac-4623-474c-9b1f-de78e668fd57") : secret "canary-serving-cert" not found
Apr 23 13:31:58.491735 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:31:58.491595 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls podName:fa0283a7-9d03-45eb-8654-fca71445e53e nodeName:}" failed. No retries permitted until 2026-04-23 13:32:06.491589111 +0000 UTC m=+47.239591044 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls") pod "dns-default-678lb" (UID: "fa0283a7-9d03-45eb-8654-fca71445e53e") : secret "dns-default-metrics-tls" not found
Apr 23 13:31:58.973447 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:58.973415 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" event={"ID":"58b92c1d-fc85-4d19-82d4-79f878c270ce","Type":"ContainerStarted","Data":"bdbb4257f118e9a1eba35756fa927953aceb5160203c6b481ccef5c08a27b9d4"}
Apr 23 13:31:58.974530 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:58.974508 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-network-diagnostics/network-check-target-pzpnn" event={"ID":"2a7db36c-22a7-4fed-ba55-f60113c7ad0c","Type":"ContainerStarted","Data":"d53c11aa2c57c5e32edd7f00a12bb44bbe464f7b03b1f50332fe5622c79bfc29"}
Apr 23 13:31:58.974659 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:58.974646 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:31:58.998396 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:58.998022 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/multus-additional-cni-plugins-b6xwg" podStartSLOduration=7.289620992 podStartE2EDuration="39.998006617s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:22.039138081 +0000 UTC m=+2.787140027" lastFinishedPulling="2026-04-23 13:31:54.747523704 +0000 UTC m=+35.495525652" observedRunningTime="2026-04-23 13:31:58.995820791 +0000 UTC m=+39.743822746" watchObservedRunningTime="2026-04-23 13:31:58.998006617 +0000 UTC m=+39.746008574"
Apr 23 13:31:59.010799 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:31:59.010753 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-network-diagnostics/network-check-target-pzpnn" podStartSLOduration=36.46479994 podStartE2EDuration="40.010741105s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:31:54.72612352 +0000 UTC m=+35.474125454" lastFinishedPulling="2026-04-23 13:31:58.272064685 +0000 UTC m=+39.020066619" observedRunningTime="2026-04-23 13:31:59.010589534 +0000 UTC m=+39.758591504" watchObservedRunningTime="2026-04-23 13:31:59.010741105 +0000 UTC m=+39.758743057"
Apr 23 13:32:06.538653 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:06.538613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:32:06.538653 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:06.538660 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:32:06.539143 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:06.538755 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:06.539143 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:06.538760 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:06.539143 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:06.538807 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls podName:fa0283a7-9d03-45eb-8654-fca71445e53e nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.538794559 +0000 UTC m=+63.286796492 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls") pod "dns-default-678lb" (UID: "fa0283a7-9d03-45eb-8654-fca71445e53e") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:06.539143 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:06.538821 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert podName:c01c5eac-4623-474c-9b1f-de78e668fd57 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:22.53881465 +0000 UTC m=+63.286816583 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert") pod "ingress-canary-cq9bf" (UID: "c01c5eac-4623-474c-9b1f-de78e668fd57") : secret "canary-serving-cert" not found
Apr 23 13:32:17.949613 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:17.949587 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ovn-kubernetes/ovnkube-node-6mgfk"
Apr 23 13:32:22.541087 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:22.541049 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:32:22.541481 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:22.541106 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:32:22.541481 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:22.541179 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:22.541481 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:22.541235 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls podName:fa0283a7-9d03-45eb-8654-fca71445e53e nodeName:}" failed. No retries permitted until 2026-04-23 13:32:54.541220187 +0000 UTC m=+95.289222120 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls") pod "dns-default-678lb" (UID: "fa0283a7-9d03-45eb-8654-fca71445e53e") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:22.541481 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:22.541185 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:22.541481 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:22.541318 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert podName:c01c5eac-4623-474c-9b1f-de78e668fd57 nodeName:}" failed. No retries permitted until 2026-04-23 13:32:54.541306723 +0000 UTC m=+95.289308656 (durationBeforeRetry 32s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert") pod "ingress-canary-cq9bf" (UID: "c01c5eac-4623-474c-9b1f-de78e668fd57") : secret "canary-serving-cert" not found
Apr 23 13:32:25.458475 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:25.458437 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6"
Apr 23 13:32:25.458838 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:25.458544 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found
Apr 23 13:32:25.458838 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:25.458599 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. No retries permitted until 2026-04-23 13:33:29.45858473 +0000 UTC m=+130.206586663 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : secret "metrics-daemon-secret" not found
Apr 23 13:32:29.978381 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:29.978355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-network-diagnostics/network-check-target-pzpnn"
Apr 23 13:32:54.638507 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:54.638358 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:32:54.638507 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:32:54.638403 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:32:54.639164 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:54.638511 2576 secret.go:189] Couldn't get secret openshift-dns/dns-default-metrics-tls: secret "dns-default-metrics-tls" not found
Apr 23 13:32:54.639164 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:54.638518 2576 secret.go:189] Couldn't get secret openshift-ingress-canary/canary-serving-cert: secret "canary-serving-cert" not found
Apr 23 13:32:54.639164 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:54.638582 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls podName:fa0283a7-9d03-45eb-8654-fca71445e53e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:58.638565792 +0000 UTC m=+159.386567725 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "metrics-tls" (UniqueName: "kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls") pod "dns-default-678lb" (UID: "fa0283a7-9d03-45eb-8654-fca71445e53e") : secret "dns-default-metrics-tls" not found
Apr 23 13:32:54.639164 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:32:54.638609 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert podName:c01c5eac-4623-474c-9b1f-de78e668fd57 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:58.638588392 +0000 UTC m=+159.386590342 (durationBeforeRetry 1m4s). Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert") pod "ingress-canary-cq9bf" (UID: "c01c5eac-4623-474c-9b1f-de78e668fd57") : secret "canary-serving-cert" not found
Apr 23 13:33:28.625535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.625504 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l"]
Apr 23 13:33:28.627259 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.627243 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l"
Apr 23 13:33:28.628659 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.628635 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"]
Apr 23 13:33:28.629667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.629647 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-storage-operator\"/\"volume-data-source-validator-dockercfg-2s9s2\""
Apr 23 13:33:28.630188 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.630171 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:28.630628 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.630600 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"kube-root-ca.crt\""
Apr 23 13:33:28.630703 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.630653 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-storage-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:33:28.632598 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.632580 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-tls\""
Apr 23 13:33:28.632687 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.632608 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kube-root-ca.crt\""
Apr 23 13:33:28.633441 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.633420 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"openshift-service-ca.crt\""
Apr 23 13:33:28.634112 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.634093 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"cluster-monitoring-operator-dockercfg-b4n2b\""
Apr 23 13:33:28.634213 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.634194 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemetry-config\""
Apr 23 13:33:28.637885 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.637866 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l"]
Apr 23 13:33:28.651042 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.651015 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"]
Apr 23 13:33:28.658100 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.658081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-548c6\" (UniqueName: \"kubernetes.io/projected/60cc1ca9-d0ba-4394-a632-3383e1572f9d-kube-api-access-548c6\") pod \"volume-data-source-validator-7c6cbb6c87-cbx2l\" (UID: \"60cc1ca9-d0ba-4394-a632-3383e1572f9d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l"
Apr 23 13:33:28.733713 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.733687 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-ingress/router-default-7cf45c9b8b-qfgth"]
Apr 23 13:33:28.735477 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.735465 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:33:28.737763 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.737742 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"service-ca-bundle\""
Apr 23 13:33:28.738010 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.737990 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"default-ingress-cert\""
Apr 23 13:33:28.738118 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.738017 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-stats-default\""
Apr 23 13:33:28.738118 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.738059 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"openshift-service-ca.crt\""
Apr 23 13:33:28.738118 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.738100 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-ingress\"/\"kube-root-ca.crt\""
Apr 23 13:33:28.738270 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.738195 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-dockercfg-9jzzd\""
Apr 23 13:33:28.738270 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.738196 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress\"/\"router-metrics-certs-default\""
Apr 23 13:33:28.749345 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.749311 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7cf45c9b8b-qfgth"]
Apr 23 13:33:28.758822 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.758805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5c6mw\" (UniqueName: \"kubernetes.io/projected/12ced166-dcdf-4e6f-9ff5-77d972bf0902-kube-api-access-5c6mw\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:28.758914 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.758841 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:28.758914 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.758871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-548c6\" (UniqueName: \"kubernetes.io/projected/60cc1ca9-d0ba-4394-a632-3383e1572f9d-kube-api-access-548c6\") pod \"volume-data-source-validator-7c6cbb6c87-cbx2l\" (UID: \"60cc1ca9-d0ba-4394-a632-3383e1572f9d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l"
Apr 23 13:33:28.759019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.758930 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/12ced166-dcdf-4e6f-9ff5-77d972bf0902-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:28.772750 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.772723 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-548c6\" (UniqueName: \"kubernetes.io/projected/60cc1ca9-d0ba-4394-a632-3383e1572f9d-kube-api-access-548c6\") pod \"volume-data-source-validator-7c6cbb6c87-cbx2l\" (UID: \"60cc1ca9-d0ba-4394-a632-3383e1572f9d\") " pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l"
Apr 23 13:33:28.830400 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.830378 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"]
Apr 23 13:33:28.832089 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.832076 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"
Apr 23 13:33:28.834341 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.834313 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"samples-operator-tls\""
Apr 23 13:33:28.834427 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.834369 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"openshift-service-ca.crt\""
Apr 23 13:33:28.834427 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.834390 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-cluster-samples-operator\"/\"kube-root-ca.crt\""
Apr 23 13:33:28.834653 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.834637 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-cluster-samples-operator\"/\"cluster-samples-operator-dockercfg-w94bs\""
Apr 23 13:33:28.842982 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.842966 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"]
Apr 23 13:33:28.860091 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.860073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-stats-auth\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:33:28.860185 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.860114 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5c6mw\" (UniqueName: \"kubernetes.io/projected/12ced166-dcdf-4e6f-9ff5-77d972bf0902-kube-api-access-5c6mw\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:28.860185 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.860136 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:28.860287 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.860185 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/12ced166-dcdf-4e6f-9ff5-77d972bf0902-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:28.860287 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:28.860199 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:33:28.860287 ip-10-0-135-229
kubenswrapper[2576]: I0423 13:33:28.860222 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.860287 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:28.860245 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls podName:12ced166-dcdf-4e6f-9ff5-77d972bf0902 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:29.36023375 +0000 UTC m=+130.108235683 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kh6vf" (UID: "12ced166-dcdf-4e6f-9ff5-77d972bf0902") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:28.860507 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.860358 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.860507 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.860381 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-default-certificate\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " 
pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.860507 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.860398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fgqnw\" (UniqueName: \"kubernetes.io/projected/de51ff65-d9f2-40df-b830-2cd1d95fe71e-kube-api-access-fgqnw\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.860863 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.860844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemetry-config\" (UniqueName: \"kubernetes.io/configmap/12ced166-dcdf-4e6f-9ff5-77d972bf0902-telemetry-config\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf" Apr 23 13:33:28.870070 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.870043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5c6mw\" (UniqueName: \"kubernetes.io/projected/12ced166-dcdf-4e6f-9ff5-77d972bf0902-kube-api-access-5c6mw\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf" Apr 23 13:33:28.936935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.936880 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l" Apr 23 13:33:28.960813 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.960784 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rp5d9\" (UniqueName: \"kubernetes.io/projected/99d2a07d-bb82-43fb-bc08-69f12aecb492-kube-api-access-rp5d9\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" Apr 23 13:33:28.960932 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.960831 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.960932 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.960847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.960932 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.960863 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-default-certificate\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.960932 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.960884 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-fgqnw\" (UniqueName: \"kubernetes.io/projected/de51ff65-d9f2-40df-b830-2cd1d95fe71e-kube-api-access-fgqnw\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.961135 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:28.960967 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:33:28.961135 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:28.961012 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:29.46099089 +0000 UTC m=+130.208992823 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : configmap references non-existent config key: service-ca.crt Apr 23 13:33:28.961135 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.961072 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-stats-auth\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.961135 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:28.961093 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. 
No retries permitted until 2026-04-23 13:33:29.461073543 +0000 UTC m=+130.209075484 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : secret "router-metrics-certs-default" not found Apr 23 13:33:28.961322 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.961153 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" Apr 23 13:33:28.963186 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.963168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"stats-auth\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-stats-auth\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.963354 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.963317 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"default-certificate\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-default-certificate\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:28.969935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:28.969898 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fgqnw\" (UniqueName: \"kubernetes.io/projected/de51ff65-d9f2-40df-b830-2cd1d95fe71e-kube-api-access-fgqnw\") pod 
\"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:29.059915 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.059886 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l"] Apr 23 13:33:29.062131 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.062109 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rp5d9\" (UniqueName: \"kubernetes.io/projected/99d2a07d-bb82-43fb-bc08-69f12aecb492-kube-api-access-rp5d9\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" Apr 23 13:33:29.062219 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.062205 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" Apr 23 13:33:29.062315 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.062303 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:33:29.062382 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.062369 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls podName:99d2a07d-bb82-43fb-bc08-69f12aecb492 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:29.562355483 +0000 UTC m=+130.310357416 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4nbzx" (UID: "99d2a07d-bb82-43fb-bc08-69f12aecb492") : secret "samples-operator-tls" not found Apr 23 13:33:29.062890 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:33:29.062867 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod60cc1ca9_d0ba_4394_a632_3383e1572f9d.slice/crio-110e9720019ab5e3ac0977b76389adc5e05a45d0f6c0116eb582d2c3ea57db24 WatchSource:0}: Error finding container 110e9720019ab5e3ac0977b76389adc5e05a45d0f6c0116eb582d2c3ea57db24: Status 404 returned error can't find the container with id 110e9720019ab5e3ac0977b76389adc5e05a45d0f6c0116eb582d2c3ea57db24 Apr 23 13:33:29.070245 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.070217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rp5d9\" (UniqueName: \"kubernetes.io/projected/99d2a07d-bb82-43fb-bc08-69f12aecb492-kube-api-access-rp5d9\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" Apr 23 13:33:29.138400 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.138377 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l" event={"ID":"60cc1ca9-d0ba-4394-a632-3383e1572f9d","Type":"ContainerStarted","Data":"110e9720019ab5e3ac0977b76389adc5e05a45d0f6c0116eb582d2c3ea57db24"} Apr 23 13:33:29.364313 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.364284 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: 
\"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf" Apr 23 13:33:29.364486 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.364397 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:29.364486 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.364450 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls podName:12ced166-dcdf-4e6f-9ff5-77d972bf0902 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:30.364436276 +0000 UTC m=+131.112438209 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kh6vf" (UID: "12ced166-dcdf-4e6f-9ff5-77d972bf0902") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:29.465042 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.465017 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:29.465162 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.465046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: 
\"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:29.465162 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.465065 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:33:29.465270 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.465169 2576 secret.go:189] Couldn't get secret openshift-multus/metrics-daemon-secret: secret "metrics-daemon-secret" not found Apr 23 13:33:29.465270 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.465180 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:33:29.465270 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.465170 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:30.465154896 +0000 UTC m=+131.213156829 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : configmap references non-existent config key: service-ca.crt Apr 23 13:33:29.465270 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.465212 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs podName:6bc685f5-9bf7-4830-a9ad-e4622169dcdb nodeName:}" failed. 
No retries permitted until 2026-04-23 13:35:31.465201435 +0000 UTC m=+252.213203368 (durationBeforeRetry 2m2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs") pod "network-metrics-daemon-ms5b6" (UID: "6bc685f5-9bf7-4830-a9ad-e4622169dcdb") : secret "metrics-daemon-secret" not found Apr 23 13:33:29.465270 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.465234 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:30.465217186 +0000 UTC m=+131.213219123 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : secret "router-metrics-certs-default" not found Apr 23 13:33:29.566344 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.566297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" Apr 23 13:33:29.566476 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.566435 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:33:29.566521 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:29.566493 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls 
podName:99d2a07d-bb82-43fb-bc08-69f12aecb492 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:30.566480022 +0000 UTC m=+131.314481955 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4nbzx" (UID: "99d2a07d-bb82-43fb-bc08-69f12aecb492") : secret "samples-operator-tls" not found Apr 23 13:33:29.629130 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.629065 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8wz65"] Apr 23 13:33:29.631491 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.631474 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.634120 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.634101 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"serving-cert\"" Apr 23 13:33:29.634215 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.634174 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console-operator\"/\"console-operator-dockercfg-pkzrp\"" Apr 23 13:33:29.634215 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.634181 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"kube-root-ca.crt\"" Apr 23 13:33:29.634351 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.634338 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"console-operator-config\"" Apr 23 13:33:29.634438 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.634421 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"openshift-service-ca.crt\"" 
Apr 23 13:33:29.642938 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.642916 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console-operator\"/\"trusted-ca\"" Apr 23 13:33:29.643031 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.642965 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8wz65"] Apr 23 13:33:29.667432 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.667411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9bc6z\" (UniqueName: \"kubernetes.io/projected/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-kube-api-access-9bc6z\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.667530 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.667446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-serving-cert\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.667530 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.667506 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-config\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.667647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.667631 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-trusted-ca\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.768539 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.768506 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-config\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.768690 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.768603 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-trusted-ca\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.768690 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.768659 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-9bc6z\" (UniqueName: \"kubernetes.io/projected/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-kube-api-access-9bc6z\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.768780 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.768689 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-serving-cert\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.769270 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:33:29.769242 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config\" (UniqueName: \"kubernetes.io/configmap/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-config\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.769919 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.769889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-trusted-ca\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.771190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.771168 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-cert\" (UniqueName: \"kubernetes.io/secret/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-serving-cert\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.779075 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.779053 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9bc6z\" (UniqueName: \"kubernetes.io/projected/bbd5d80a-4c64-4b49-a070-1d5161e7afc9-kube-api-access-9bc6z\") pod \"console-operator-9d4b6777b-8wz65\" (UID: \"bbd5d80a-4c64-4b49-a070-1d5161e7afc9\") " pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:29.944390 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:29.944311 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:33:30.064883 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.064853 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console-operator/console-operator-9d4b6777b-8wz65"] Apr 23 13:33:30.068698 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:33:30.068670 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podbbd5d80a_4c64_4b49_a070_1d5161e7afc9.slice/crio-ee1bf55769434c9baf7479ac712188dc9aa6e3fbb21f9fac39ae243a94c8c4c9 WatchSource:0}: Error finding container ee1bf55769434c9baf7479ac712188dc9aa6e3fbb21f9fac39ae243a94c8c4c9: Status 404 returned error can't find the container with id ee1bf55769434c9baf7479ac712188dc9aa6e3fbb21f9fac39ae243a94c8c4c9 Apr 23 13:33:30.140908 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.140876 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" event={"ID":"bbd5d80a-4c64-4b49-a070-1d5161e7afc9","Type":"ContainerStarted","Data":"ee1bf55769434c9baf7479ac712188dc9aa6e3fbb21f9fac39ae243a94c8c4c9"} Apr 23 13:33:30.373316 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.373263 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf" Apr 23 13:33:30.373493 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.373417 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:30.373566 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.373501 
2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls podName:12ced166-dcdf-4e6f-9ff5-77d972bf0902 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:32.373480944 +0000 UTC m=+133.121482892 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kh6vf" (UID: "12ced166-dcdf-4e6f-9ff5-77d972bf0902") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:30.473939 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.473915 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:30.474061 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.473945 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:30.474061 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.474045 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:33:30.474123 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.474092 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" 
failed. No retries permitted until 2026-04-23 13:33:32.47407555 +0000 UTC m=+133.222077483 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : secret "router-metrics-certs-default" not found Apr 23 13:33:30.474123 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.474109 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:32.474098394 +0000 UTC m=+133.222100335 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : configmap references non-existent config key: service-ca.crt Apr 23 13:33:30.575528 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.575490 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" Apr 23 13:33:30.575689 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.575655 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:33:30.575785 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.575770 2576 nestedpendingoperations.go:348] Operation for 
"{volumeName:kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls podName:99d2a07d-bb82-43fb-bc08-69f12aecb492 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:32.575720839 +0000 UTC m=+133.323722784 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4nbzx" (UID: "99d2a07d-bb82-43fb-bc08-69f12aecb492") : secret "samples-operator-tls" not found Apr 23 13:33:30.697767 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.697688 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-857b476854-crbr6"] Apr 23 13:33:30.699750 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.699731 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.702456 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.702382 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-tls\"" Apr 23 13:33:30.702456 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.702436 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"registry-dockercfg-fxpfk\"" Apr 23 13:33:30.702627 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.702439 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"image-registry-private-configuration\"" Apr 23 13:33:30.702627 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.702436 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-image-registry\"/\"installation-pull-secrets\"" Apr 23 13:33:30.707409 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.707388 2576 reflector.go:430] "Caches populated" 
type="*v1.ConfigMap" reflector="object-\"openshift-image-registry\"/\"trusted-ca\"" Apr 23 13:33:30.714321 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.714301 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-857b476854-crbr6"] Apr 23 13:33:30.777813 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.777789 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-installation-pull-secrets\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.777963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.777842 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-image-registry-private-configuration\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.777963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.777871 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.777963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.777896 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-ca-trust-extracted\") pod 
\"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.778119 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.777977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-trusted-ca\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.778119 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.778010 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-bound-sa-token\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.778119 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.778076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-certificates\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.778119 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.778101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lxlq2\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-kube-api-access-lxlq2\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.878586 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:33:30.878550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-image-registry-private-configuration\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.878758 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.878596 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.878758 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.878615 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-ca-trust-extracted\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.878758 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.878716 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:30.878758 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.878739 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-857b476854-crbr6: secret "image-registry-tls" not found Apr 23 13:33:30.878963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.878787 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-trusted-ca\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.878963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.878816 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-bound-sa-token\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.878963 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:30.878845 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls podName:b1b028da-d16a-4e08-8ffb-7da1255d3fc8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:31.378822063 +0000 UTC m=+132.126824012 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls") pod "image-registry-857b476854-crbr6" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8") : secret "image-registry-tls" not found Apr 23 13:33:30.878963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.878900 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-certificates\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.878963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.878927 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lxlq2\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-kube-api-access-lxlq2\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.879222 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.879103 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-installation-pull-secrets\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.879647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.879612 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-ca-trust-extracted\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " 
pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.880413 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.880386 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-certificates\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.880828 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.880808 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-trusted-ca\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.881926 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.881874 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-installation-pull-secrets\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.882037 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.881941 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-image-registry-private-configuration\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.887089 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.887064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-bound-sa-token\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:30.887400 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:30.887379 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lxlq2\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-kube-api-access-lxlq2\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:31.143574 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:31.143544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l" event={"ID":"60cc1ca9-d0ba-4394-a632-3383e1572f9d","Type":"ContainerStarted","Data":"4eb828ff0f76c259e4d95b3dc6a8d6aa4e6001a023bf1a6808a855cfaf4c35c1"} Apr 23 13:33:31.159470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:31.159430 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-storage-operator/volume-data-source-validator-7c6cbb6c87-cbx2l" podStartSLOduration=1.8070692080000001 podStartE2EDuration="3.159417851s" podCreationTimestamp="2026-04-23 13:33:28 +0000 UTC" firstStartedPulling="2026-04-23 13:33:29.064553899 +0000 UTC m=+129.812555832" lastFinishedPulling="2026-04-23 13:33:30.416902542 +0000 UTC m=+131.164904475" observedRunningTime="2026-04-23 13:33:31.159140025 +0000 UTC m=+131.907141979" watchObservedRunningTime="2026-04-23 13:33:31.159417851 +0000 UTC m=+131.907419806" Apr 23 13:33:31.383543 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:31.383508 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: 
\"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:31.383737 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:31.383714 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:31.383790 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:31.383740 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-857b476854-crbr6: secret "image-registry-tls" not found Apr 23 13:33:31.383827 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:31.383810 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls podName:b1b028da-d16a-4e08-8ffb-7da1255d3fc8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:32.383789241 +0000 UTC m=+133.131791179 (durationBeforeRetry 1s). 
Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls") pod "image-registry-857b476854-crbr6" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8") : secret "image-registry-tls" not found Apr 23 13:33:32.146862 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.146835 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/0.log" Apr 23 13:33:32.147212 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.146874 2576 generic.go:358] "Generic (PLEG): container finished" podID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9" containerID="d28dc0dae758f679aa12aaa2e575250eca8b3902d1a46860a62a20faa43cc4a8" exitCode=255 Apr 23 13:33:32.147212 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.146907 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" event={"ID":"bbd5d80a-4c64-4b49-a070-1d5161e7afc9","Type":"ContainerDied","Data":"d28dc0dae758f679aa12aaa2e575250eca8b3902d1a46860a62a20faa43cc4a8"} Apr 23 13:33:32.147212 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.147125 2576 scope.go:117] "RemoveContainer" containerID="d28dc0dae758f679aa12aaa2e575250eca8b3902d1a46860a62a20faa43cc4a8" Apr 23 13:33:32.391673 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.391646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:33:32.391810 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.391695 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" 
(UniqueName: \"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf" Apr 23 13:33:32.391810 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.391802 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found Apr 23 13:33:32.391913 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.391817 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:32.391913 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.391822 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-857b476854-crbr6: secret "image-registry-tls" not found Apr 23 13:33:32.391913 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.391870 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls podName:12ced166-dcdf-4e6f-9ff5-77d972bf0902 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:36.391853939 +0000 UTC m=+137.139855881 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kh6vf" (UID: "12ced166-dcdf-4e6f-9ff5-77d972bf0902") : secret "cluster-monitoring-operator-tls" not found Apr 23 13:33:32.391913 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.391883 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls podName:b1b028da-d16a-4e08-8ffb-7da1255d3fc8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:34.39187729 +0000 UTC m=+135.139879223 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls") pod "image-registry-857b476854-crbr6" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8") : secret "image-registry-tls" not found Apr 23 13:33:32.492494 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.492434 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:32.492494 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.492463 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:33:32.492636 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.492554 2576 secret.go:189] Couldn't get secret 
openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found Apr 23 13:33:32.492636 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.492574 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:36.492558727 +0000 UTC m=+137.240560660 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : configmap references non-existent config key: service-ca.crt Apr 23 13:33:32.492636 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.492592 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:36.492585551 +0000 UTC m=+137.240587484 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : secret "router-metrics-certs-default" not found Apr 23 13:33:32.593938 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:32.593901 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" Apr 23 13:33:32.594075 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.594017 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found Apr 23 13:33:32.594075 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:32.594060 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls podName:99d2a07d-bb82-43fb-bc08-69f12aecb492 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:36.594048862 +0000 UTC m=+137.342050795 (durationBeforeRetry 4s). 
Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4nbzx" (UID: "99d2a07d-bb82-43fb-bc08-69f12aecb492") : secret "samples-operator-tls" not found Apr 23 13:33:33.149926 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:33.149895 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/1.log" Apr 23 13:33:33.150361 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:33.150258 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/0.log" Apr 23 13:33:33.150361 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:33.150298 2576 generic.go:358] "Generic (PLEG): container finished" podID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9" containerID="2212a144a04e9a61800a68cab16e141f965c3a23dee67223a55a7726a54a7f5d" exitCode=255 Apr 23 13:33:33.150471 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:33.150373 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" event={"ID":"bbd5d80a-4c64-4b49-a070-1d5161e7afc9","Type":"ContainerDied","Data":"2212a144a04e9a61800a68cab16e141f965c3a23dee67223a55a7726a54a7f5d"} Apr 23 13:33:33.150471 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:33.150427 2576 scope.go:117] "RemoveContainer" containerID="d28dc0dae758f679aa12aaa2e575250eca8b3902d1a46860a62a20faa43cc4a8" Apr 23 13:33:33.150566 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:33.150553 2576 scope.go:117] "RemoveContainer" containerID="2212a144a04e9a61800a68cab16e141f965c3a23dee67223a55a7726a54a7f5d" Apr 23 13:33:33.150754 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:33.150738 2576 pod_workers.go:1301] "Error syncing pod, skipping" 
err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8wz65_openshift-console-operator(bbd5d80a-4c64-4b49-a070-1d5161e7afc9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" podUID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9"
Apr 23 13:33:33.981759 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:33.981733 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-89qsg_8e915a91-9dcb-4454-9ac4-0012727f6bdd/dns-node-resolver/0.log"
Apr 23 13:33:34.153469 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:34.153445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/1.log"
Apr 23 13:33:34.153829 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:34.153746 2576 scope.go:117] "RemoveContainer" containerID="2212a144a04e9a61800a68cab16e141f965c3a23dee67223a55a7726a54a7f5d"
Apr 23 13:33:34.153902 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:34.153887 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8wz65_openshift-console-operator(bbd5d80a-4c64-4b49-a070-1d5161e7afc9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" podUID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9"
Apr 23 13:33:34.408150 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:34.408115 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6"
Apr 23 13:33:34.408288 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:34.408258 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:33:34.408288 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:34.408274 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-857b476854-crbr6: secret "image-registry-tls" not found
Apr 23 13:33:34.408394 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:34.408324 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls podName:b1b028da-d16a-4e08-8ffb-7da1255d3fc8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:38.408309473 +0000 UTC m=+139.156311406 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls") pod "image-registry-857b476854-crbr6" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8") : secret "image-registry-tls" not found
Apr 23 13:33:35.181260 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:35.181234 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7g8xg_fccbedc6-6cb0-47bb-8b72-95f91484d090/node-ca/0.log"
Apr 23 13:33:36.423081 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:36.423046 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:36.423455 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:36.423186 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:33:36.423455 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:36.423252 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls podName:12ced166-dcdf-4e6f-9ff5-77d972bf0902 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:44.423236654 +0000 UTC m=+145.171238587 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kh6vf" (UID: "12ced166-dcdf-4e6f-9ff5-77d972bf0902") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:33:36.523594 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:36.523558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:33:36.523594 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:36.523599 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:33:36.523738 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:36.523696 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:33:36.523738 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:36.523717 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:44.523702267 +0000 UTC m=+145.271704200 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : configmap references non-existent config key: service-ca.crt
Apr 23 13:33:36.523738 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:36.523738 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:33:44.523726933 +0000 UTC m=+145.271728866 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : secret "router-metrics-certs-default" not found
Apr 23 13:33:36.624250 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:36.624222 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"
Apr 23 13:33:36.624380 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:36.624344 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:33:36.624424 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:36.624388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls podName:99d2a07d-bb82-43fb-bc08-69f12aecb492 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:44.624377147 +0000 UTC m=+145.372379080 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4nbzx" (UID: "99d2a07d-bb82-43fb-bc08-69f12aecb492") : secret "samples-operator-tls" not found
Apr 23 13:33:38.437822 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:38.437792 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6"
Apr 23 13:33:38.438184 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:38.437896 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:33:38.438184 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:38.437917 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-857b476854-crbr6: secret "image-registry-tls" not found
Apr 23 13:33:38.438184 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:38.437967 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls podName:b1b028da-d16a-4e08-8ffb-7da1255d3fc8 nodeName:}" failed. No retries permitted until 2026-04-23 13:33:46.437954284 +0000 UTC m=+147.185956217 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls") pod "image-registry-857b476854-crbr6" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8") : secret "image-registry-tls" not found
Apr 23 13:33:39.944571 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:39.944544 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65"
Apr 23 13:33:39.944571 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:39.944576 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65"
Apr 23 13:33:39.944945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:39.944879 2576 scope.go:117] "RemoveContainer" containerID="2212a144a04e9a61800a68cab16e141f965c3a23dee67223a55a7726a54a7f5d"
Apr 23 13:33:39.945044 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:39.945029 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 10s restarting failed container=console-operator pod=console-operator-9d4b6777b-8wz65_openshift-console-operator(bbd5d80a-4c64-4b49-a070-1d5161e7afc9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" podUID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9"
Apr 23 13:33:41.492730 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.492700 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-service-ca/service-ca-865cb79987-rg2cw"]
Apr 23 13:33:41.494406 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.494391 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.496624 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.496602 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"signing-key\""
Apr 23 13:33:41.496881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.496863 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"signing-cabundle\""
Apr 23 13:33:41.496964 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.496874 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"openshift-service-ca.crt\""
Apr 23 13:33:41.498015 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.497997 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-service-ca\"/\"service-ca-dockercfg-tfllh\""
Apr 23 13:33:41.498086 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.497999 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-service-ca\"/\"kube-root-ca.crt\""
Apr 23 13:33:41.506868 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.506847 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-rg2cw"]
Apr 23 13:33:41.562031 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.562006 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9afae374-e716-4c8c-9f0e-7080043bcf4d-signing-cabundle\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.562120 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.562068 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wt6s8\" (UniqueName: \"kubernetes.io/projected/9afae374-e716-4c8c-9f0e-7080043bcf4d-kube-api-access-wt6s8\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.562162 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.562142 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9afae374-e716-4c8c-9f0e-7080043bcf4d-signing-key\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.662664 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.662627 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wt6s8\" (UniqueName: \"kubernetes.io/projected/9afae374-e716-4c8c-9f0e-7080043bcf4d-kube-api-access-wt6s8\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.662749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.662681 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9afae374-e716-4c8c-9f0e-7080043bcf4d-signing-key\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.662749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.662727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9afae374-e716-4c8c-9f0e-7080043bcf4d-signing-cabundle\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.664172 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.664150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-cabundle\" (UniqueName: \"kubernetes.io/configmap/9afae374-e716-4c8c-9f0e-7080043bcf4d-signing-cabundle\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.664963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.664944 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"signing-key\" (UniqueName: \"kubernetes.io/secret/9afae374-e716-4c8c-9f0e-7080043bcf4d-signing-key\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.670595 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.670577 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wt6s8\" (UniqueName: \"kubernetes.io/projected/9afae374-e716-4c8c-9f0e-7080043bcf4d-kube-api-access-wt6s8\") pod \"service-ca-865cb79987-rg2cw\" (UID: \"9afae374-e716-4c8c-9f0e-7080043bcf4d\") " pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.803895 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.803868 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-service-ca/service-ca-865cb79987-rg2cw"
Apr 23 13:33:41.910761 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:41.910733 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-service-ca/service-ca-865cb79987-rg2cw"]
Apr 23 13:33:41.913518 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:33:41.913488 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9afae374_e716_4c8c_9f0e_7080043bcf4d.slice/crio-f1c29cf9b34f3925aeab26bdf6425e105fe8a8b26c937d7e64bdaf9c27b8ff43 WatchSource:0}: Error finding container f1c29cf9b34f3925aeab26bdf6425e105fe8a8b26c937d7e64bdaf9c27b8ff43: Status 404 returned error can't find the container with id f1c29cf9b34f3925aeab26bdf6425e105fe8a8b26c937d7e64bdaf9c27b8ff43
Apr 23 13:33:42.168197 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:42.168125 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-rg2cw" event={"ID":"9afae374-e716-4c8c-9f0e-7080043bcf4d","Type":"ContainerStarted","Data":"f1c29cf9b34f3925aeab26bdf6425e105fe8a8b26c937d7e64bdaf9c27b8ff43"}
Apr 23 13:33:44.172890 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:44.172855 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-service-ca/service-ca-865cb79987-rg2cw" event={"ID":"9afae374-e716-4c8c-9f0e-7080043bcf4d","Type":"ContainerStarted","Data":"b6a46b157b55fa5a29cc22b2860492f7c37d90f394b14ee3f193240c150ce1ef"}
Apr 23 13:33:44.190942 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:44.190896 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-service-ca/service-ca-865cb79987-rg2cw" podStartSLOduration=1.368525784 podStartE2EDuration="3.190883708s" podCreationTimestamp="2026-04-23 13:33:41 +0000 UTC" firstStartedPulling="2026-04-23 13:33:41.915205435 +0000 UTC m=+142.663207370" lastFinishedPulling="2026-04-23 13:33:43.73756336 +0000 UTC m=+144.485565294" observedRunningTime="2026-04-23 13:33:44.189152859 +0000 UTC m=+144.937154834" watchObservedRunningTime="2026-04-23 13:33:44.190883708 +0000 UTC m=+144.938885662"
Apr 23 13:33:44.485640 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:44.485570 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:33:44.485757 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:44.485703 2576 secret.go:189] Couldn't get secret openshift-monitoring/cluster-monitoring-operator-tls: secret "cluster-monitoring-operator-tls" not found
Apr 23 13:33:44.485793 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:44.485759 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls podName:12ced166-dcdf-4e6f-9ff5-77d972bf0902 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:00.485743456 +0000 UTC m=+161.233745397 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "cluster-monitoring-operator-tls" (UniqueName: "kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls") pod "cluster-monitoring-operator-75587bd455-kh6vf" (UID: "12ced166-dcdf-4e6f-9ff5-77d972bf0902") : secret "cluster-monitoring-operator-tls" not found
Apr 23 13:33:44.586718 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:44.586688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:33:44.586718 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:44.586722 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:33:44.586867 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:44.586831 2576 secret.go:189] Couldn't get secret openshift-ingress/router-metrics-certs-default: secret "router-metrics-certs-default" not found
Apr 23 13:33:44.586867 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:44.586837 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:34:00.586818623 +0000 UTC m=+161.334820569 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "service-ca-bundle" (UniqueName: "kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : configmap references non-existent config key: service-ca.crt
Apr 23 13:33:44.586867 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:44.586863 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs podName:de51ff65-d9f2-40df-b830-2cd1d95fe71e nodeName:}" failed. No retries permitted until 2026-04-23 13:34:00.586853461 +0000 UTC m=+161.334855394 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "metrics-certs" (UniqueName: "kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs") pod "router-default-7cf45c9b8b-qfgth" (UID: "de51ff65-d9f2-40df-b830-2cd1d95fe71e") : secret "router-metrics-certs-default" not found
Apr 23 13:33:44.687212 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:44.687189 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"
Apr 23 13:33:44.687292 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:44.687265 2576 secret.go:189] Couldn't get secret openshift-cluster-samples-operator/samples-operator-tls: secret "samples-operator-tls" not found
Apr 23 13:33:44.687325 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:44.687298 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls podName:99d2a07d-bb82-43fb-bc08-69f12aecb492 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:00.687289846 +0000 UTC m=+161.435291779 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "samples-operator-tls" (UniqueName: "kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls") pod "cluster-samples-operator-6dc5bdb6b4-4nbzx" (UID: "99d2a07d-bb82-43fb-bc08-69f12aecb492") : secret "samples-operator-tls" not found
Apr 23 13:33:46.500641 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:46.500614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6"
Apr 23 13:33:46.501076 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:46.500784 2576 projected.go:264] Couldn't get secret openshift-image-registry/image-registry-tls: secret "image-registry-tls" not found
Apr 23 13:33:46.501076 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:46.500807 2576 projected.go:194] Error preparing data for projected volume registry-tls for pod openshift-image-registry/image-registry-857b476854-crbr6: secret "image-registry-tls" not found
Apr 23 13:33:46.501076 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:46.500877 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls podName:b1b028da-d16a-4e08-8ffb-7da1255d3fc8 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:02.500856592 +0000 UTC m=+163.248858542 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "registry-tls" (UniqueName: "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls") pod "image-registry-857b476854-crbr6" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8") : secret "image-registry-tls" not found
Apr 23 13:33:53.680555 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:53.680520 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[cert], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-ingress-canary/ingress-canary-cq9bf" podUID="c01c5eac-4623-474c-9b1f-de78e668fd57"
Apr 23 13:33:53.690802 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:53.690773 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-tls], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-dns/dns-default-678lb" podUID="fa0283a7-9d03-45eb-8654-fca71445e53e"
Apr 23 13:33:54.193373 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:54.193345 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:33:54.820989 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:54.820962 2576 scope.go:117] "RemoveContainer" containerID="2212a144a04e9a61800a68cab16e141f965c3a23dee67223a55a7726a54a7f5d"
Apr 23 13:33:54.841248 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:54.841225 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[metrics-certs], unattached volumes=[], failed to process volumes=[]: context deadline exceeded" pod="openshift-multus/network-metrics-daemon-ms5b6" podUID="6bc685f5-9bf7-4830-a9ad-e4622169dcdb"
Apr 23 13:33:55.196584 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:55.196533 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log"
Apr 23 13:33:55.196899 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:55.196883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/1.log"
Apr 23 13:33:55.196984 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:55.196918 2576 generic.go:358] "Generic (PLEG): container finished" podID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9" containerID="4d76995909f54d6636e3838a1764623a3bf39a05a21ecc28fdfdf99461d24b04" exitCode=255
Apr 23 13:33:55.197029 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:55.196987 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" event={"ID":"bbd5d80a-4c64-4b49-a070-1d5161e7afc9","Type":"ContainerDied","Data":"4d76995909f54d6636e3838a1764623a3bf39a05a21ecc28fdfdf99461d24b04"}
Apr 23 13:33:55.197029 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:55.197023 2576 scope.go:117] "RemoveContainer" containerID="2212a144a04e9a61800a68cab16e141f965c3a23dee67223a55a7726a54a7f5d"
Apr 23 13:33:55.197357 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:55.197323 2576 scope.go:117] "RemoveContainer" containerID="4d76995909f54d6636e3838a1764623a3bf39a05a21ecc28fdfdf99461d24b04"
Apr 23 13:33:55.197555 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:55.197536 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-8wz65_openshift-console-operator(bbd5d80a-4c64-4b49-a070-1d5161e7afc9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" podUID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9"
Apr 23 13:33:56.200452 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:56.200432 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log"
Apr 23 13:33:58.691634 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:58.691593 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:33:58.692058 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:58.691646 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:33:58.694303 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:58.694262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-tls\" (UniqueName: \"kubernetes.io/secret/fa0283a7-9d03-45eb-8654-fca71445e53e-metrics-tls\") pod \"dns-default-678lb\" (UID: \"fa0283a7-9d03-45eb-8654-fca71445e53e\") " pod="openshift-dns/dns-default-678lb"
Apr 23 13:33:58.694303 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:58.694289 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/c01c5eac-4623-474c-9b1f-de78e668fd57-cert\") pod \"ingress-canary-cq9bf\" (UID: \"c01c5eac-4623-474c-9b1f-de78e668fd57\") " pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:33:59.001714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:59.001640 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-ingress-canary\"/\"default-dockercfg-pj4z2\""
Apr 23 13:33:59.004851 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:59.004832 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress-canary/ingress-canary-cq9bf"
Apr 23 13:33:59.113986 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:59.113958 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress-canary/ingress-canary-cq9bf"]
Apr 23 13:33:59.116644 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:33:59.116610 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podc01c5eac_4623_474c_9b1f_de78e668fd57.slice/crio-4af6cbc7b320d21c6bc3ff99b32d6a158c85dcc399a0067dedf00436577c519a WatchSource:0}: Error finding container 4af6cbc7b320d21c6bc3ff99b32d6a158c85dcc399a0067dedf00436577c519a: Status 404 returned error can't find the container with id 4af6cbc7b320d21c6bc3ff99b32d6a158c85dcc399a0067dedf00436577c519a
Apr 23 13:33:59.207891 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:59.207864 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cq9bf" event={"ID":"c01c5eac-4623-474c-9b1f-de78e668fd57","Type":"ContainerStarted","Data":"4af6cbc7b320d21c6bc3ff99b32d6a158c85dcc399a0067dedf00436577c519a"}
Apr 23 13:33:59.945303 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:59.945258 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65"
Apr 23 13:33:59.945303 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:59.945308 2576 kubelet.go:2658] "SyncLoop (probe)" probe="liveness" status="unhealthy" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65"
Apr 23 13:33:59.945796 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:33:59.945665 2576 scope.go:117] "RemoveContainer" containerID="4d76995909f54d6636e3838a1764623a3bf39a05a21ecc28fdfdf99461d24b04"
Apr 23 13:33:59.945872 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:33:59.945855 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-8wz65_openshift-console-operator(bbd5d80a-4c64-4b49-a070-1d5161e7afc9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" podUID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9"
Apr 23 13:34:00.508503 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.508464 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:34:00.511345 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.511302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-monitoring-operator-tls\" (UniqueName: \"kubernetes.io/secret/12ced166-dcdf-4e6f-9ff5-77d972bf0902-cluster-monitoring-operator-tls\") pod \"cluster-monitoring-operator-75587bd455-kh6vf\" (UID: \"12ced166-dcdf-4e6f-9ff5-77d972bf0902\") " pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:34:00.609014 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.608993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:34:00.609102 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.609023 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:34:00.609535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.609518 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/de51ff65-d9f2-40df-b830-2cd1d95fe71e-service-ca-bundle\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:34:00.611008 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.610992 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/de51ff65-d9f2-40df-b830-2cd1d95fe71e-metrics-certs\") pod \"router-default-7cf45c9b8b-qfgth\" (UID: \"de51ff65-d9f2-40df-b830-2cd1d95fe71e\") " pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:34:00.709588 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.709555 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"
Apr 23 13:34:00.711745 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.711725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"samples-operator-tls\" (UniqueName: \"kubernetes.io/secret/99d2a07d-bb82-43fb-bc08-69f12aecb492-samples-operator-tls\") pod \"cluster-samples-operator-6dc5bdb6b4-4nbzx\" (UID: \"99d2a07d-bb82-43fb-bc08-69f12aecb492\") " pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"
Apr 23 13:34:00.742155 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.741774 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"
Apr 23 13:34:00.843718 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.843693 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth"
Apr 23 13:34:00.854955 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.854933 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf"]
Apr 23 13:34:00.857654 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:00.857630 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod12ced166_dcdf_4e6f_9ff5_77d972bf0902.slice/crio-c8453713ffd46765aafa32e2cf254afd9adee962030b237522cfb7501cc52c1c WatchSource:0}: Error finding container c8453713ffd46765aafa32e2cf254afd9adee962030b237522cfb7501cc52c1c: Status 404 returned error can't find the container with id c8453713ffd46765aafa32e2cf254afd9adee962030b237522cfb7501cc52c1c
Apr 23 13:34:00.939738 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.939699 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"
Apr 23 13:34:00.955251 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:00.955229 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-ingress/router-default-7cf45c9b8b-qfgth"]
Apr 23 13:34:00.957019 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:00.956995 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podde51ff65_d9f2_40df_b830_2cd1d95fe71e.slice/crio-df4a07402559214fbc3be2b58203b1339039f25cc30c77c76293ceb2c61b2e78 WatchSource:0}: Error finding container df4a07402559214fbc3be2b58203b1339039f25cc30c77c76293ceb2c61b2e78: Status 404 returned error can't find the container with id df4a07402559214fbc3be2b58203b1339039f25cc30c77c76293ceb2c61b2e78
Apr 23 13:34:01.052492 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.052407 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api"
pods=["openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx"] Apr 23 13:34:01.214023 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.213990 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress-canary/ingress-canary-cq9bf" event={"ID":"c01c5eac-4623-474c-9b1f-de78e668fd57","Type":"ContainerStarted","Data":"005c5f8d6704d4bedc7b88166fe7ec67f9f5c1fa77dd7ad16066bc04a18c7f55"} Apr 23 13:34:01.215127 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.215104 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" event={"ID":"de51ff65-d9f2-40df-b830-2cd1d95fe71e","Type":"ContainerStarted","Data":"151986bad48166ecf7fc5dddde45cd3a8986fe75ea745bae91460650720d8d75"} Apr 23 13:34:01.215239 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.215132 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" event={"ID":"de51ff65-d9f2-40df-b830-2cd1d95fe71e","Type":"ContainerStarted","Data":"df4a07402559214fbc3be2b58203b1339039f25cc30c77c76293ceb2c61b2e78"} Apr 23 13:34:01.216000 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.215980 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" event={"ID":"99d2a07d-bb82-43fb-bc08-69f12aecb492","Type":"ContainerStarted","Data":"2fcaccb6a3ef6ac561b7a64e0e2a3df8fdacd7359f9ab347ec585444bfe75ec0"} Apr 23 13:34:01.216854 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.216837 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf" event={"ID":"12ced166-dcdf-4e6f-9ff5-77d972bf0902","Type":"ContainerStarted","Data":"c8453713ffd46765aafa32e2cf254afd9adee962030b237522cfb7501cc52c1c"} Apr 23 13:34:01.228870 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.228835 2576 pod_startup_latency_tracker.go:104] "Observed pod startup 
duration" pod="openshift-ingress-canary/ingress-canary-cq9bf" podStartSLOduration=129.760022517 podStartE2EDuration="2m11.228823661s" podCreationTimestamp="2026-04-23 13:31:50 +0000 UTC" firstStartedPulling="2026-04-23 13:33:59.118471151 +0000 UTC m=+159.866473084" lastFinishedPulling="2026-04-23 13:34:00.58727229 +0000 UTC m=+161.335274228" observedRunningTime="2026-04-23 13:34:01.228253442 +0000 UTC m=+161.976255419" watchObservedRunningTime="2026-04-23 13:34:01.228823661 +0000 UTC m=+161.976825615" Apr 23 13:34:01.246532 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.246492 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" podStartSLOduration=33.246481105 podStartE2EDuration="33.246481105s" podCreationTimestamp="2026-04-23 13:33:28 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:01.246204533 +0000 UTC m=+161.994206498" watchObservedRunningTime="2026-04-23 13:34:01.246481105 +0000 UTC m=+161.994483059" Apr 23 13:34:01.844399 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.844370 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:34:01.847066 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:01.847034 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:34:02.224207 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.224121 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:34:02.225667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.225642 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-ingress/router-default-7cf45c9b8b-qfgth" Apr 23 13:34:02.474059 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:34:02.474029 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-insights/insights-runtime-extractor-4dwkv"] Apr 23 13:34:02.480301 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.480270 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.484456 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.484433 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-tls\"" Apr 23 13:34:02.484597 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.484577 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-root-ca.crt\"" Apr 23 13:34:02.485674 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.485650 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"kube-rbac-proxy\"" Apr 23 13:34:02.485674 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.485668 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-insights\"/\"openshift-service-ca.crt\"" Apr 23 13:34:02.485845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.485655 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-insights\"/\"insights-runtime-extractor-sa-dockercfg-qjjwc\"" Apr 23 13:34:02.494640 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.494620 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-857b476854-crbr6"] Apr 23 13:34:02.494814 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:34:02.494796 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="unmounted volumes=[registry-tls], unattached volumes=[], failed to process volumes=[]: context canceled" pod="openshift-image-registry/image-registry-857b476854-crbr6" 
podUID="b1b028da-d16a-4e08-8ffb-7da1255d3fc8" Apr 23 13:34:02.503864 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.503797 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4dwkv"] Apr 23 13:34:02.526413 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.526386 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:34:02.528930 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.528908 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"image-registry-857b476854-crbr6\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:34:02.560807 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.560783 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-image-registry/image-registry-69f68b99b9-xc5cv"] Apr 23 13:34:02.564056 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.564039 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.586374 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.586352 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69f68b99b9-xc5cv"] Apr 23 13:34:02.627568 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.627541 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-47446\" (UniqueName: \"kubernetes.io/projected/433dd8e3-1838-432a-bf89-c07ffb6fef04-kube-api-access-47446\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.627656 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.627573 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/433dd8e3-1838-432a-bf89-c07ffb6fef04-crio-socket\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.627656 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.627607 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/433dd8e3-1838-432a-bf89-c07ffb6fef04-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.627764 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.627666 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/433dd8e3-1838-432a-bf89-c07ffb6fef04-kube-rbac-proxy-cm\") pod 
\"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.627764 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.627688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/433dd8e3-1838-432a-bf89-c07ffb6fef04-data-volume\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.728302 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728273 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/433dd8e3-1838-432a-bf89-c07ffb6fef04-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.728302 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728306 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/433dd8e3-1838-432a-bf89-c07ffb6fef04-data-volume\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.728508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728354 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4pcwj\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-kube-api-access-4pcwj\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.728508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728383 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d51efffd-4a46-43d3-a25d-88497f1ec487-image-registry-private-configuration\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.728508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728400 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d51efffd-4a46-43d3-a25d-88497f1ec487-installation-pull-secrets\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.728508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728417 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d51efffd-4a46-43d3-a25d-88497f1ec487-trusted-ca\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.728508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728481 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-47446\" (UniqueName: \"kubernetes.io/projected/433dd8e3-1838-432a-bf89-c07ffb6fef04-kube-api-access-47446\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.728748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728524 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"crio-socket\" (UniqueName: 
\"kubernetes.io/host-path/433dd8e3-1838-432a-bf89-c07ffb6fef04-crio-socket\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.728748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"insights-runtime-extractor-tls\" (UniqueName: \"kubernetes.io/secret/433dd8e3-1838-432a-bf89-c07ffb6fef04-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.728748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d51efffd-4a46-43d3-a25d-88497f1ec487-registry-certificates\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.728748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-bound-sa-token\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.728748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728630 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data-volume\" (UniqueName: \"kubernetes.io/empty-dir/433dd8e3-1838-432a-bf89-c07ffb6fef04-data-volume\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " 
pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.728748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728650 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"crio-socket\" (UniqueName: \"kubernetes.io/host-path/433dd8e3-1838-432a-bf89-c07ffb6fef04-crio-socket\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.728748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728659 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-registry-tls\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.728748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728703 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d51efffd-4a46-43d3-a25d-88497f1ec487-ca-trust-extracted\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.729091 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.728951 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-rbac-proxy-cm\" (UniqueName: \"kubernetes.io/configmap/433dd8e3-1838-432a-bf89-c07ffb6fef04-kube-rbac-proxy-cm\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.730781 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.730705 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"insights-runtime-extractor-tls\" (UniqueName: 
\"kubernetes.io/secret/433dd8e3-1838-432a-bf89-c07ffb6fef04-insights-runtime-extractor-tls\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.740254 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.740234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-47446\" (UniqueName: \"kubernetes.io/projected/433dd8e3-1838-432a-bf89-c07ffb6fef04-kube-api-access-47446\") pod \"insights-runtime-extractor-4dwkv\" (UID: \"433dd8e3-1838-432a-bf89-c07ffb6fef04\") " pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.792028 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.792008 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-insights/insights-runtime-extractor-4dwkv" Apr 23 13:34:02.829141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829120 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d51efffd-4a46-43d3-a25d-88497f1ec487-image-registry-private-configuration\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.829245 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d51efffd-4a46-43d3-a25d-88497f1ec487-installation-pull-secrets\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.829245 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829167 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca\" (UniqueName: 
\"kubernetes.io/configmap/d51efffd-4a46-43d3-a25d-88497f1ec487-trusted-ca\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.829245 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d51efffd-4a46-43d3-a25d-88497f1ec487-registry-certificates\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.829245 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-bound-sa-token\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.829490 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829259 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-registry-tls\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.829490 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829285 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d51efffd-4a46-43d3-a25d-88497f1ec487-ca-trust-extracted\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.829490 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4pcwj\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-kube-api-access-4pcwj\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.829982 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.829958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/d51efffd-4a46-43d3-a25d-88497f1ec487-ca-trust-extracted\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.830078 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.830018 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/d51efffd-4a46-43d3-a25d-88497f1ec487-registry-certificates\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.830314 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.830294 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/d51efffd-4a46-43d3-a25d-88497f1ec487-trusted-ca\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.832123 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.832076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-registry-tls\") pod 
\"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.832624 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.832582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/d51efffd-4a46-43d3-a25d-88497f1ec487-image-registry-private-configuration\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.833237 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.833201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/d51efffd-4a46-43d3-a25d-88497f1ec487-installation-pull-secrets\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.840889 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.840793 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4pcwj\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-kube-api-access-4pcwj\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.846136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.846076 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/d51efffd-4a46-43d3-a25d-88497f1ec487-bound-sa-token\") pod \"image-registry-69f68b99b9-xc5cv\" (UID: \"d51efffd-4a46-43d3-a25d-88497f1ec487\") " pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.874101 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.873771 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:02.932378 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:02.932186 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-insights/insights-runtime-extractor-4dwkv"] Apr 23 13:34:02.936376 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:02.936319 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod433dd8e3_1838_432a_bf89_c07ffb6fef04.slice/crio-1c6fc2f138dfdea0c4612fbc580636270d664c197376f5f6c8cbac62fbe0db8c WatchSource:0}: Error finding container 1c6fc2f138dfdea0c4612fbc580636270d664c197376f5f6c8cbac62fbe0db8c: Status 404 returned error can't find the container with id 1c6fc2f138dfdea0c4612fbc580636270d664c197376f5f6c8cbac62fbe0db8c Apr 23 13:34:03.036535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.036508 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-image-registry/image-registry-69f68b99b9-xc5cv"] Apr 23 13:34:03.039587 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:03.039552 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd51efffd_4a46_43d3_a25d_88497f1ec487.slice/crio-cf44d9744a873bb082903131a56983346aff6a708b56ca9dd7abc0a4466ae8b4 WatchSource:0}: Error finding container cf44d9744a873bb082903131a56983346aff6a708b56ca9dd7abc0a4466ae8b4: Status 404 returned error can't find the container with id cf44d9744a873bb082903131a56983346aff6a708b56ca9dd7abc0a4466ae8b4 Apr 23 13:34:03.228938 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.228897 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" event={"ID":"d51efffd-4a46-43d3-a25d-88497f1ec487","Type":"ContainerStarted","Data":"a2a2de55390125c6831bb1c138d36ca5e75e8440fdbd9de629d93ab0964a5273"} 
Apr 23 13:34:03.228938 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.228941 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" event={"ID":"d51efffd-4a46-43d3-a25d-88497f1ec487","Type":"ContainerStarted","Data":"cf44d9744a873bb082903131a56983346aff6a708b56ca9dd7abc0a4466ae8b4"} Apr 23 13:34:03.229452 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.229001 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:03.230268 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.230242 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4dwkv" event={"ID":"433dd8e3-1838-432a-bf89-c07ffb6fef04","Type":"ContainerStarted","Data":"ed4b2a452179a7ae39b6e0b195c3a5c5d40afee6d7c38b7b0018b4b9f4d9290d"} Apr 23 13:34:03.230379 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.230275 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4dwkv" event={"ID":"433dd8e3-1838-432a-bf89-c07ffb6fef04","Type":"ContainerStarted","Data":"1c6fc2f138dfdea0c4612fbc580636270d664c197376f5f6c8cbac62fbe0db8c"} Apr 23 13:34:03.231850 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.231821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" event={"ID":"99d2a07d-bb82-43fb-bc08-69f12aecb492","Type":"ContainerStarted","Data":"474924ee734298c898c7bc8dcc253120d086a89de58dae0dd2a219929d69f20d"} Apr 23 13:34:03.231973 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.231852 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" 
event={"ID":"99d2a07d-bb82-43fb-bc08-69f12aecb492","Type":"ContainerStarted","Data":"971923cccfd88e4c3ff17da80fccde8f79477bc76cfa7d64ab6b89a9e3b55346"} Apr 23 13:34:03.233057 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.233036 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf" event={"ID":"12ced166-dcdf-4e6f-9ff5-77d972bf0902","Type":"ContainerStarted","Data":"2b29498c5930b8eefecff51bf9e99e352c8ea39b217e3f9ffed9acd275c613f7"} Apr 23 13:34:03.233131 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.233074 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:34:03.237051 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.237034 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:34:03.313283 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.313222 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" podStartSLOduration=1.313204397 podStartE2EDuration="1.313204397s" podCreationTimestamp="2026-04-23 13:34:02 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:03.288791125 +0000 UTC m=+164.036793076" watchObservedRunningTime="2026-04-23 13:34:03.313204397 +0000 UTC m=+164.061206352" Apr 23 13:34:03.314410 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.314363 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/cluster-monitoring-operator-75587bd455-kh6vf" podStartSLOduration=33.374778726 podStartE2EDuration="35.314347767s" podCreationTimestamp="2026-04-23 13:33:28 +0000 UTC" firstStartedPulling="2026-04-23 13:34:00.85938317 +0000 UTC m=+161.607385118" 
lastFinishedPulling="2026-04-23 13:34:02.79895221 +0000 UTC m=+163.546954159" observedRunningTime="2026-04-23 13:34:03.313355461 +0000 UTC m=+164.061357402" watchObservedRunningTime="2026-04-23 13:34:03.314347767 +0000 UTC m=+164.062349728" Apr 23 13:34:03.331427 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331407 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-bound-sa-token\") pod \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " Apr 23 13:34:03.331529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331454 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") pod \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " Apr 23 13:34:03.331529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331514 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-image-registry-private-configuration\") pod \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " Apr 23 13:34:03.331635 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331532 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-certificates\") pod \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " Apr 23 13:34:03.331635 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331551 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"ca-trust-extracted\" (UniqueName: 
\"kubernetes.io/empty-dir/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-ca-trust-extracted\") pod \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " Apr 23 13:34:03.331635 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331573 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-trusted-ca\") pod \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " Apr 23 13:34:03.331635 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331615 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lxlq2\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-kube-api-access-lxlq2\") pod \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " Apr 23 13:34:03.331828 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331640 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-installation-pull-secrets\") pod \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\" (UID: \"b1b028da-d16a-4e08-8ffb-7da1255d3fc8\") " Apr 23 13:34:03.331905 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.331879 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-certificates" (OuterVolumeSpecName: "registry-certificates") pod "b1b028da-d16a-4e08-8ffb-7da1255d3fc8" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8"). InnerVolumeSpecName "registry-certificates". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:03.332043 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.332019 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-trusted-ca" (OuterVolumeSpecName: "trusted-ca") pod "b1b028da-d16a-4e08-8ffb-7da1255d3fc8" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8"). InnerVolumeSpecName "trusted-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:34:03.332454 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.332346 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-ca-trust-extracted" (OuterVolumeSpecName: "ca-trust-extracted") pod "b1b028da-d16a-4e08-8ffb-7da1255d3fc8" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8"). InnerVolumeSpecName "ca-trust-extracted". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:34:03.332942 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.332909 2576 reconciler_common.go:299] "Volume detached for volume \"registry-certificates\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-certificates\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:34:03.333065 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.333052 2576 reconciler_common.go:299] "Volume detached for volume \"ca-trust-extracted\" (UniqueName: \"kubernetes.io/empty-dir/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-ca-trust-extracted\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:34:03.333152 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.333141 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca\" (UniqueName: \"kubernetes.io/configmap/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-trusted-ca\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:34:03.334077 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:34:03.333883 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls" (OuterVolumeSpecName: "registry-tls") pod "b1b028da-d16a-4e08-8ffb-7da1255d3fc8" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8"). InnerVolumeSpecName "registry-tls". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:03.334077 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.334032 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-image-registry-private-configuration" (OuterVolumeSpecName: "image-registry-private-configuration") pod "b1b028da-d16a-4e08-8ffb-7da1255d3fc8" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8"). InnerVolumeSpecName "image-registry-private-configuration". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:34:03.334788 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.334763 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-bound-sa-token" (OuterVolumeSpecName: "bound-sa-token") pod "b1b028da-d16a-4e08-8ffb-7da1255d3fc8" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8"). InnerVolumeSpecName "bound-sa-token". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:03.335358 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.335313 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-kube-api-access-lxlq2" (OuterVolumeSpecName: "kube-api-access-lxlq2") pod "b1b028da-d16a-4e08-8ffb-7da1255d3fc8" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8"). InnerVolumeSpecName "kube-api-access-lxlq2". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:34:03.335834 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.335807 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-installation-pull-secrets" (OuterVolumeSpecName: "installation-pull-secrets") pod "b1b028da-d16a-4e08-8ffb-7da1255d3fc8" (UID: "b1b028da-d16a-4e08-8ffb-7da1255d3fc8"). InnerVolumeSpecName "installation-pull-secrets". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:34:03.340145 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.340084 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-cluster-samples-operator/cluster-samples-operator-6dc5bdb6b4-4nbzx" podStartSLOduration=33.633422372 podStartE2EDuration="35.340073887s" podCreationTimestamp="2026-04-23 13:33:28 +0000 UTC" firstStartedPulling="2026-04-23 13:34:01.093664206 +0000 UTC m=+161.841666138" lastFinishedPulling="2026-04-23 13:34:02.800315716 +0000 UTC m=+163.548317653" observedRunningTime="2026-04-23 13:34:03.339076928 +0000 UTC m=+164.087078882" watchObservedRunningTime="2026-04-23 13:34:03.340073887 +0000 UTC m=+164.088075841" Apr 23 13:34:03.433877 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.433842 2576 reconciler_common.go:299] "Volume detached for volume \"image-registry-private-configuration\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-image-registry-private-configuration\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:34:03.433877 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.433874 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lxlq2\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-kube-api-access-lxlq2\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:34:03.434102 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.433890 2576 
reconciler_common.go:299] "Volume detached for volume \"installation-pull-secrets\" (UniqueName: \"kubernetes.io/secret/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-installation-pull-secrets\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:34:03.434102 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.433904 2576 reconciler_common.go:299] "Volume detached for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-bound-sa-token\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:34:03.434102 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:03.433913 2576 reconciler_common.go:299] "Volume detached for volume \"registry-tls\" (UniqueName: \"kubernetes.io/projected/b1b028da-d16a-4e08-8ffb-7da1255d3fc8-registry-tls\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:34:04.237775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:04.237679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4dwkv" event={"ID":"433dd8e3-1838-432a-bf89-c07ffb6fef04","Type":"ContainerStarted","Data":"fe04a7894f16717801f3f0f08b7ff68af6135af43182ad20c9e18e533973f6ae"} Apr 23 13:34:04.237775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:04.237740 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-image-registry/image-registry-857b476854-crbr6" Apr 23 13:34:04.281290 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:04.281263 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-image-registry/image-registry-857b476854-crbr6"] Apr 23 13:34:04.285648 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:04.285627 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-image-registry/image-registry-857b476854-crbr6"] Apr 23 13:34:05.242295 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:05.242264 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-insights/insights-runtime-extractor-4dwkv" event={"ID":"433dd8e3-1838-432a-bf89-c07ffb6fef04","Type":"ContainerStarted","Data":"587fd1163c8482ff0084979454715d9098882fa8b3ed87d7179e91b408045cb7"} Apr 23 13:34:05.260247 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:05.260207 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-insights/insights-runtime-extractor-4dwkv" podStartSLOduration=1.16895444 podStartE2EDuration="3.260190955s" podCreationTimestamp="2026-04-23 13:34:02 +0000 UTC" firstStartedPulling="2026-04-23 13:34:03.003298348 +0000 UTC m=+163.751300281" lastFinishedPulling="2026-04-23 13:34:05.094534859 +0000 UTC m=+165.842536796" observedRunningTime="2026-04-23 13:34:05.259093899 +0000 UTC m=+166.007095870" watchObservedRunningTime="2026-04-23 13:34:05.260190955 +0000 UTC m=+166.008192911" Apr 23 13:34:05.824653 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:05.824624 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="b1b028da-d16a-4e08-8ffb-7da1255d3fc8" path="/var/lib/kubelet/pods/b1b028da-d16a-4e08-8ffb-7da1255d3fc8/volumes" Apr 23 13:34:06.820659 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:06.820625 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:34:07.557165 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.557129 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-w58g9"] Apr 23 13:34:07.562031 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.561744 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.564600 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.564581 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-kube-rbac-proxy-config\"" Apr 23 13:34:07.564883 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.564850 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-tls\"" Apr 23 13:34:07.565681 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.565668 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"prometheus-operator-dockercfg-vtflt\"" Apr 23 13:34:07.565897 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.565882 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-client-ca\"" Apr 23 13:34:07.567610 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.567574 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-w58g9"] Apr 23 13:34:07.662088 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.662064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: 
\"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.662190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.662099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.662190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.662120 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f358a06-d61f-4f99-8778-a8ede80db2a5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.662273 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.662201 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-f5c4v\" (UniqueName: \"kubernetes.io/projected/0f358a06-d61f-4f99-8778-a8ede80db2a5-kube-api-access-f5c4v\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.763010 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.762982 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-f5c4v\" (UniqueName: \"kubernetes.io/projected/0f358a06-d61f-4f99-8778-a8ede80db2a5-kube-api-access-f5c4v\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.763106 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:34:07.763033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.763106 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.763069 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.763200 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.763102 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f358a06-d61f-4f99-8778-a8ede80db2a5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.763200 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:34:07.763184 2576 secret.go:189] Couldn't get secret openshift-monitoring/prometheus-operator-tls: secret "prometheus-operator-tls" not found Apr 23 13:34:07.763287 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:34:07.763233 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-tls podName:0f358a06-d61f-4f99-8778-a8ede80db2a5 nodeName:}" failed. No retries permitted until 2026-04-23 13:34:08.26322067 +0000 UTC m=+169.011222603 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "prometheus-operator-tls" (UniqueName: "kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-tls") pod "prometheus-operator-5676c8c784-w58g9" (UID: "0f358a06-d61f-4f99-8778-a8ede80db2a5") : secret "prometheus-operator-tls" not found Apr 23 13:34:07.763721 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.763702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0f358a06-d61f-4f99-8778-a8ede80db2a5-metrics-client-ca\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.765219 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.765197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-kube-rbac-proxy-config\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.772602 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.772582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-f5c4v\" (UniqueName: \"kubernetes.io/projected/0f358a06-d61f-4f99-8778-a8ede80db2a5-kube-api-access-f5c4v\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:07.820710 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.820660 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-dns/dns-default-678lb" Apr 23 13:34:07.823372 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.823353 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-dns\"/\"dns-dockercfg-gtr2j\"" Apr 23 13:34:07.831586 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.831572 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-dns/dns-default-678lb" Apr 23 13:34:07.954603 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:07.954576 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-dns/dns-default-678lb"] Apr 23 13:34:07.957369 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:07.957343 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfa0283a7_9d03_45eb_8654_fca71445e53e.slice/crio-d6119f57e7b465e55403856ddf625e0cf65e7cba8140dda44e4f4ab0e35d3260 WatchSource:0}: Error finding container d6119f57e7b465e55403856ddf625e0cf65e7cba8140dda44e4f4ab0e35d3260: Status 404 returned error can't find the container with id d6119f57e7b465e55403856ddf625e0cf65e7cba8140dda44e4f4ab0e35d3260 Apr 23 13:34:08.250867 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:08.250789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-678lb" event={"ID":"fa0283a7-9d03-45eb-8654-fca71445e53e","Type":"ContainerStarted","Data":"d6119f57e7b465e55403856ddf625e0cf65e7cba8140dda44e4f4ab0e35d3260"} Apr 23 13:34:08.266168 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:08.266144 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:08.268338 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:08.268309 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"prometheus-operator-tls\" (UniqueName: \"kubernetes.io/secret/0f358a06-d61f-4f99-8778-a8ede80db2a5-prometheus-operator-tls\") pod \"prometheus-operator-5676c8c784-w58g9\" (UID: \"0f358a06-d61f-4f99-8778-a8ede80db2a5\") " pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:08.471424 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:08.471397 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" Apr 23 13:34:08.592067 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:08.592038 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/prometheus-operator-5676c8c784-w58g9"] Apr 23 13:34:08.595589 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:08.595556 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0f358a06_d61f_4f99_8778_a8ede80db2a5.slice/crio-cf3ee3baa16798e16a7615180f965b59d6088dcd1deabff5af768ab250ab4972 WatchSource:0}: Error finding container cf3ee3baa16798e16a7615180f965b59d6088dcd1deabff5af768ab250ab4972: Status 404 returned error can't find the container with id cf3ee3baa16798e16a7615180f965b59d6088dcd1deabff5af768ab250ab4972 Apr 23 13:34:09.254685 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:09.254646 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" event={"ID":"0f358a06-d61f-4f99-8778-a8ede80db2a5","Type":"ContainerStarted","Data":"cf3ee3baa16798e16a7615180f965b59d6088dcd1deabff5af768ab250ab4972"} Apr 23 13:34:10.259392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:10.259354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" 
event={"ID":"0f358a06-d61f-4f99-8778-a8ede80db2a5","Type":"ContainerStarted","Data":"d85db7e7f71f282ad02292fa2ec61d82a2d15ae129168f1425d3f486d1c92165"} Apr 23 13:34:10.259392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:10.259394 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" event={"ID":"0f358a06-d61f-4f99-8778-a8ede80db2a5","Type":"ContainerStarted","Data":"4ac1d9a4e91b309292eb81c782028e4a411ef545dce4cb0f32df27c39e7258b7"} Apr 23 13:34:10.260909 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:10.260890 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-678lb" event={"ID":"fa0283a7-9d03-45eb-8654-fca71445e53e","Type":"ContainerStarted","Data":"bb38c682a6c376029d3c89e57d2f98439505b9ed5b4040841dde395270ca29f5"} Apr 23 13:34:10.260958 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:10.260916 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-dns/dns-default-678lb" event={"ID":"fa0283a7-9d03-45eb-8654-fca71445e53e","Type":"ContainerStarted","Data":"dd600f16340b26425955053f9d10bacae6fd00c2b90e24a0843f31f0eeb3ce24"} Apr 23 13:34:10.261012 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:10.261002 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-dns/dns-default-678lb" Apr 23 13:34:10.276529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:10.276488 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/prometheus-operator-5676c8c784-w58g9" podStartSLOduration=2.06995421 podStartE2EDuration="3.276474433s" podCreationTimestamp="2026-04-23 13:34:07 +0000 UTC" firstStartedPulling="2026-04-23 13:34:08.597671044 +0000 UTC m=+169.345672978" lastFinishedPulling="2026-04-23 13:34:09.804191264 +0000 UTC m=+170.552193201" observedRunningTime="2026-04-23 13:34:10.275669457 +0000 UTC m=+171.023671412" watchObservedRunningTime="2026-04-23 13:34:10.276474433 +0000 UTC 
m=+171.024476385" Apr 23 13:34:10.292226 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:10.292188 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-dns/dns-default-678lb" podStartSLOduration=138.856175086 podStartE2EDuration="2m20.292176995s" podCreationTimestamp="2026-04-23 13:31:50 +0000 UTC" firstStartedPulling="2026-04-23 13:34:07.959142421 +0000 UTC m=+168.707144355" lastFinishedPulling="2026-04-23 13:34:09.395144328 +0000 UTC m=+170.143146264" observedRunningTime="2026-04-23 13:34:10.291454924 +0000 UTC m=+171.039456877" watchObservedRunningTime="2026-04-23 13:34:10.292176995 +0000 UTC m=+171.040178950" Apr 23 13:34:11.932174 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.932138 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/node-exporter-pcpnf"] Apr 23 13:34:11.935555 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.935533 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.938183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.938162 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-tls\"" Apr 23 13:34:11.938472 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.938454 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-kube-rbac-proxy-config\"" Apr 23 13:34:11.938624 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.938606 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"node-exporter-accelerators-collector-config\"" Apr 23 13:34:11.938704 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.938653 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"node-exporter-dockercfg-d2mkv\"" Apr 23 13:34:11.995304 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:34:11.995280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-wtmp\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.995426 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.995308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-tls\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.995426 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.995368 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-metrics-client-ca\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.995426 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.995389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-textfile\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.995426 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.995409 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-sys\") pod \"node-exporter-pcpnf\" (UID: 
\"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.995609 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.995438 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-accelerators-collector-config\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.995609 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.995470 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-hwdgj\" (UniqueName: \"kubernetes.io/projected/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-kube-api-access-hwdgj\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.995609 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.995497 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-root\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:11.995609 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:11.995529 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096643 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-wtmp\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096786 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096682 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-tls\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096786 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096730 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-metrics-client-ca\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096786 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096758 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-textfile\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-sys\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096819 2576 
reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-accelerators-collector-config\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096824 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-wtmp\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-wtmp\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096847 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-hwdgj\" (UniqueName: \"kubernetes.io/projected/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-kube-api-access-hwdgj\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.096921 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:34:12.096872 2576 secret.go:189] Couldn't get secret openshift-monitoring/node-exporter-tls: secret "node-exporter-tls" not found Apr 23 13:34:12.096921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096916 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"root\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-root\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.097188 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:34:12.096935 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-tls 
podName:00adbaa1-560d-44f6-bc21-2fbd5b8b655e nodeName:}" failed. No retries permitted until 2026-04-23 13:34:12.59691568 +0000 UTC m=+173.344917618 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "node-exporter-tls" (UniqueName: "kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-tls") pod "node-exporter-pcpnf" (UID: "00adbaa1-560d-44f6-bc21-2fbd5b8b655e") : secret "node-exporter-tls" not found Apr 23 13:34:12.097188 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096873 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"root\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-root\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.097188 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.096993 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.097188 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.097174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-textfile\" (UniqueName: \"kubernetes.io/empty-dir/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-textfile\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.097362 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.097231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-sys\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " 
pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.097362 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.097347 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-metrics-client-ca\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.097682 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.097663 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-accelerators-collector-config\" (UniqueName: \"kubernetes.io/configmap/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-accelerators-collector-config\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.099142 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.099122 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-kube-rbac-proxy-config\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.106284 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.106263 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-hwdgj\" (UniqueName: \"kubernetes.io/projected/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-kube-api-access-hwdgj\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.599782 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.599754 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"node-exporter-tls\" (UniqueName: 
\"kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-tls\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.601873 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.601856 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"node-exporter-tls\" (UniqueName: \"kubernetes.io/secret/00adbaa1-560d-44f6-bc21-2fbd5b8b655e-node-exporter-tls\") pod \"node-exporter-pcpnf\" (UID: \"00adbaa1-560d-44f6-bc21-2fbd5b8b655e\") " pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.821573 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.821553 2576 scope.go:117] "RemoveContainer" containerID="4d76995909f54d6636e3838a1764623a3bf39a05a21ecc28fdfdf99461d24b04" Apr 23 13:34:12.821729 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:34:12.821713 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"console-operator\" with CrashLoopBackOff: \"back-off 20s restarting failed container=console-operator pod=console-operator-9d4b6777b-8wz65_openshift-console-operator(bbd5d80a-4c64-4b49-a070-1d5161e7afc9)\"" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" podUID="bbd5d80a-4c64-4b49-a070-1d5161e7afc9" Apr 23 13:34:12.846744 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.846720 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/node-exporter-pcpnf" Apr 23 13:34:12.855675 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:12.855649 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00adbaa1_560d_44f6_bc21_2fbd5b8b655e.slice/crio-c0871fb0cc78aa77f11da1e4fa8f3bbc64edf400684f3c6a3c1527e8c81551ec WatchSource:0}: Error finding container c0871fb0cc78aa77f11da1e4fa8f3bbc64edf400684f3c6a3c1527e8c81551ec: Status 404 returned error can't find the container with id c0871fb0cc78aa77f11da1e4fa8f3bbc64edf400684f3c6a3c1527e8c81551ec Apr 23 13:34:12.982166 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.982096 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:34:12.987779 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.987757 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:12.990489 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.990468 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 13:34:12.990489 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.990479 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 13:34:12.990640 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.990489 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 13:34:12.990640 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.990484 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 13:34:12.990640 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.990592 2576 
reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 13:34:12.990838 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.990806 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 13:34:12.991022 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.990999 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 13:34:12.991104 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.991079 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 13:34:12.991177 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.991160 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 13:34:12.991262 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.991249 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-jf6mr\"" Apr 23 13:34:12.998915 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:12.998897 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:34:13.003115 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003094 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-out\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003203 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003203 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003203 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003377 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003205 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003377 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003227 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: 
\"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003377 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003256 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-web-config\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003377 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003280 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-volume\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003377 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003302 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-r2zvm\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-kube-api-access-r2zvm\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003377 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003363 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003646 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003390 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: 
\"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003646 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003411 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.003646 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.003445 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104222 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104149 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104222 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104192 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104222 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104216 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-out\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104496 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104236 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104496 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104264 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104496 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:34:13.104322 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle podName:0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef nodeName:}" failed. No retries permitted until 2026-04-23 13:34:13.604300584 +0000 UTC m=+174.352302533 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "alertmanager-trusted-ca-bundle" (UniqueName: "kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle") pod "alertmanager-main-0" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef") : configmap references non-existent config key: ca-bundle.crt Apr 23 13:34:13.104663 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104613 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104768 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104668 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104768 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104768 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104748 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-web-config\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104906 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:34:13.104773 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-volume\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104906 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104797 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-r2zvm\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-kube-api-access-r2zvm\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104906 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104846 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.104906 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104890 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.105087 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.104954 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.105626 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.105603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.107175 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.107129 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-out\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.107710 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.107683 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.108553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.108455 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.108553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.108511 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-volume\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.108553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.108535 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.108771 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.108672 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-tls-assets\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.108771 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.108713 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.109577 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.109556 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.109839 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.109821 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-web-config\") pod \"alertmanager-main-0\" (UID: 
\"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.113172 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.113152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-r2zvm\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-kube-api-access-r2zvm\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.271842 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.271817 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pcpnf" event={"ID":"00adbaa1-560d-44f6-bc21-2fbd5b8b655e","Type":"ContainerStarted","Data":"c0871fb0cc78aa77f11da1e4fa8f3bbc64edf400684f3c6a3c1527e8c81551ec"} Apr 23 13:34:13.607263 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.607233 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.608103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.608083 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:13.896776 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:13.896752 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:34:14.023943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:14.023920 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:34:14.025405 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:14.025377 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0c944e1b_5a9c_43e7_9d12_58a2eda6c7ef.slice/crio-e56be08d179b3d659c8db8ec617aed6825833bda5ced3edb0ff574c5705eb615 WatchSource:0}: Error finding container e56be08d179b3d659c8db8ec617aed6825833bda5ced3edb0ff574c5705eb615: Status 404 returned error can't find the container with id e56be08d179b3d659c8db8ec617aed6825833bda5ced3edb0ff574c5705eb615 Apr 23 13:34:14.275531 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:14.275471 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerStarted","Data":"e56be08d179b3d659c8db8ec617aed6825833bda5ced3edb0ff574c5705eb615"} Apr 23 13:34:14.276933 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:14.276903 2576 generic.go:358] "Generic (PLEG): container finished" podID="00adbaa1-560d-44f6-bc21-2fbd5b8b655e" containerID="6376bcdf35c0cc15c7b9166859fa85e3ca6ffcfbcaf004362ea1a3ee9912665f" exitCode=0 Apr 23 13:34:14.277039 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:14.276943 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pcpnf" event={"ID":"00adbaa1-560d-44f6-bc21-2fbd5b8b655e","Type":"ContainerDied","Data":"6376bcdf35c0cc15c7b9166859fa85e3ca6ffcfbcaf004362ea1a3ee9912665f"} Apr 23 13:34:15.282104 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:15.282072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pcpnf" 
event={"ID":"00adbaa1-560d-44f6-bc21-2fbd5b8b655e","Type":"ContainerStarted","Data":"c588abb5bff6a9a26651ac8996e332252c7140a5a19a316486105af17780bc06"} Apr 23 13:34:15.282482 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:15.282109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/node-exporter-pcpnf" event={"ID":"00adbaa1-560d-44f6-bc21-2fbd5b8b655e","Type":"ContainerStarted","Data":"4a1481b77097a7c91f8faea9552c22551cc4d4e3060f8fcccacdd5c1b5df089e"} Apr 23 13:34:15.283475 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:15.283452 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerID="556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1" exitCode=0 Apr 23 13:34:15.283587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:15.283486 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerDied","Data":"556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1"} Apr 23 13:34:15.302141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:15.302104 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/node-exporter-pcpnf" podStartSLOduration=3.46926744 podStartE2EDuration="4.302092744s" podCreationTimestamp="2026-04-23 13:34:11 +0000 UTC" firstStartedPulling="2026-04-23 13:34:12.857631621 +0000 UTC m=+173.605633558" lastFinishedPulling="2026-04-23 13:34:13.690456923 +0000 UTC m=+174.438458862" observedRunningTime="2026-04-23 13:34:15.29998793 +0000 UTC m=+176.047989884" watchObservedRunningTime="2026-04-23 13:34:15.302092744 +0000 UTC m=+176.050094698" Apr 23 13:34:16.206106 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.206073 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/metrics-server-7f65b95986-th9fz"] Apr 23 13:34:16.209145 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:34:16.209124 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.211889 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.211864 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"metrics-server-audit-profiles\"" Apr 23 13:34:16.211999 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.211899 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-tls\"" Apr 23 13:34:16.213317 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.213177 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-client-certs\"" Apr 23 13:34:16.213317 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.213189 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"kubelet-serving-ca-bundle\"" Apr 23 13:34:16.213317 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.213220 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-7fvvkgc6cs8en\"" Apr 23 13:34:16.213317 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.213237 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"metrics-server-dockercfg-x5725\"" Apr 23 13:34:16.220523 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.220501 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f65b95986-th9fz"] Apr 23 13:34:16.226985 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.226961 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fc39319f-3709-42ba-822c-fe086f71c769-metrics-server-audit-profiles\") pod 
\"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.227084 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.227008 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc39319f-3709-42ba-822c-fe086f71c769-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.227084 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.227037 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fc39319f-3709-42ba-822c-fe086f71c769-audit-log\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.227084 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.227073 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-secret-metrics-server-client-certs\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.227242 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.227230 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-secret-metrics-server-tls\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " 
pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.227300 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.227261 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-client-ca-bundle\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.227370 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.227295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-9r48x\" (UniqueName: \"kubernetes.io/projected/fc39319f-3709-42ba-822c-fe086f71c769-kube-api-access-9r48x\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.327822 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.327788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-secret-metrics-server-tls\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.328199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.327827 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-client-ca-bundle\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.328199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.327868 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"kube-api-access-9r48x\" (UniqueName: \"kubernetes.io/projected/fc39319f-3709-42ba-822c-fe086f71c769-kube-api-access-9r48x\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.328199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.327917 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fc39319f-3709-42ba-822c-fe086f71c769-metrics-server-audit-profiles\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.328199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.327961 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc39319f-3709-42ba-822c-fe086f71c769-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.328199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.327988 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fc39319f-3709-42ba-822c-fe086f71c769-audit-log\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.328199 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.328016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-secret-metrics-server-client-certs\") pod \"metrics-server-7f65b95986-th9fz\" (UID: 
\"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.328937 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.328875 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"audit-log\" (UniqueName: \"kubernetes.io/empty-dir/fc39319f-3709-42ba-822c-fe086f71c769-audit-log\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.328937 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.328930 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"configmap-kubelet-serving-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/fc39319f-3709-42ba-822c-fe086f71c769-configmap-kubelet-serving-ca-bundle\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.329184 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.329142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-server-audit-profiles\" (UniqueName: \"kubernetes.io/configmap/fc39319f-3709-42ba-822c-fe086f71c769-metrics-server-audit-profiles\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.330570 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.330552 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-client-certs\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-secret-metrics-server-client-certs\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.330850 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.330831 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"client-ca-bundle\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-client-ca-bundle\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.330911 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.330831 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-metrics-server-tls\" (UniqueName: \"kubernetes.io/secret/fc39319f-3709-42ba-822c-fe086f71c769-secret-metrics-server-tls\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.335835 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.335817 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-9r48x\" (UniqueName: \"kubernetes.io/projected/fc39319f-3709-42ba-822c-fe086f71c769-kube-api-access-9r48x\") pod \"metrics-server-7f65b95986-th9fz\" (UID: \"fc39319f-3709-42ba-822c-fe086f71c769\") " pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.520381 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.520294 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:16.660639 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:16.660606 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/metrics-server-7f65b95986-th9fz"] Apr 23 13:34:16.994983 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:16.994952 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podfc39319f_3709_42ba_822c_fe086f71c769.slice/crio-7f69ff35f0457e0cc2c2becfe52e40a448bbba715d668a81bae0493364ebc0b8 WatchSource:0}: Error finding container 7f69ff35f0457e0cc2c2becfe52e40a448bbba715d668a81bae0493364ebc0b8: Status 404 returned error can't find the container with id 7f69ff35f0457e0cc2c2becfe52e40a448bbba715d668a81bae0493364ebc0b8 Apr 23 13:34:17.290239 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:17.290215 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" event={"ID":"fc39319f-3709-42ba-822c-fe086f71c769","Type":"ContainerStarted","Data":"7f69ff35f0457e0cc2c2becfe52e40a448bbba715d668a81bae0493364ebc0b8"} Apr 23 13:34:17.292340 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:17.292307 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerStarted","Data":"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1"} Apr 23 13:34:17.292427 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:17.292349 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerStarted","Data":"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690"} Apr 23 13:34:17.292427 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:17.292359 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerStarted","Data":"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324"} Apr 23 13:34:17.292427 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:17.292367 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerStarted","Data":"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa"} Apr 23 13:34:18.299606 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:18.299568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerStarted","Data":"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a"} Apr 23 13:34:19.303712 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:19.303679 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" event={"ID":"fc39319f-3709-42ba-822c-fe086f71c769","Type":"ContainerStarted","Data":"d65912dedd21792e513f43b53a6b67de7882f92f214d130e1875b773680d24d9"} Apr 23 13:34:19.306547 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:19.306521 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerStarted","Data":"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f"} Apr 23 13:34:19.320522 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:19.320482 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" podStartSLOduration=1.649666807 podStartE2EDuration="3.320470892s" podCreationTimestamp="2026-04-23 13:34:16 +0000 UTC" firstStartedPulling="2026-04-23 13:34:16.996736877 +0000 UTC m=+177.744738810" lastFinishedPulling="2026-04-23 
13:34:18.667540962 +0000 UTC m=+179.415542895" observedRunningTime="2026-04-23 13:34:19.32026941 +0000 UTC m=+180.068271387" watchObservedRunningTime="2026-04-23 13:34:19.320470892 +0000 UTC m=+180.068472847" Apr 23 13:34:19.348197 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:19.348162 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.710152913 podStartE2EDuration="7.348151944s" podCreationTimestamp="2026-04-23 13:34:12 +0000 UTC" firstStartedPulling="2026-04-23 13:34:14.027203972 +0000 UTC m=+174.775205906" lastFinishedPulling="2026-04-23 13:34:18.665203001 +0000 UTC m=+179.413204937" observedRunningTime="2026-04-23 13:34:19.346847462 +0000 UTC m=+180.094849417" watchObservedRunningTime="2026-04-23 13:34:19.348151944 +0000 UTC m=+180.096153904" Apr 23 13:34:20.267554 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:20.267526 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-dns/dns-default-678lb" Apr 23 13:34:22.878399 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:22.878364 2576 patch_prober.go:28] interesting pod/image-registry-69f68b99b9-xc5cv container/registry namespace/openshift-image-registry: Liveness probe status=failure output="HTTP probe failed with statuscode: 503" start-of-body={"errors":[{"code":"UNAVAILABLE","message":"service unavailable","detail":"health check failed: please see /debug/health"}]} Apr 23 13:34:22.878749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:22.878420 2576 prober.go:120] "Probe failed" probeType="Liveness" pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" podUID="d51efffd-4a46-43d3-a25d-88497f1ec487" containerName="registry" probeResult="failure" output="HTTP probe failed with statuscode: 503" Apr 23 13:34:24.241894 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:24.241863 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-image-registry/image-registry-69f68b99b9-xc5cv" Apr 23 13:34:27.820825 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:27.820796 2576 scope.go:117] "RemoveContainer" containerID="4d76995909f54d6636e3838a1764623a3bf39a05a21ecc28fdfdf99461d24b04" Apr 23 13:34:28.331553 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.331528 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:34:28.331725 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.331597 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" event={"ID":"bbd5d80a-4c64-4b49-a070-1d5161e7afc9","Type":"ContainerStarted","Data":"fe7fd2b4737a13300554f8ff33ec5191b540d17e37b6eb98d7fedc60a69945ce"} Apr 23 13:34:28.331909 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.331890 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 23 13:34:28.352049 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.352013 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" podStartSLOduration=57.813510545 podStartE2EDuration="59.352003686s" podCreationTimestamp="2026-04-23 13:33:29 +0000 UTC" firstStartedPulling="2026-04-23 13:33:30.071004163 +0000 UTC m=+130.819006108" lastFinishedPulling="2026-04-23 13:33:31.609497316 +0000 UTC m=+132.357499249" observedRunningTime="2026-04-23 13:34:28.351932562 +0000 UTC m=+189.099934519" watchObservedRunningTime="2026-04-23 13:34:28.352003686 +0000 UTC m=+189.100005640" Apr 23 13:34:28.683735 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.683663 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console-operator/console-operator-9d4b6777b-8wz65" Apr 
23 13:34:28.852458 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.852426 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/downloads-6bcc868b7-rtxwf"] Apr 23 13:34:28.855479 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.855459 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rtxwf" Apr 23 13:34:28.858407 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.858387 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"openshift-service-ca.crt\"" Apr 23 13:34:28.858747 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.858728 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"default-dockercfg-54h7k\"" Apr 23 13:34:28.858890 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.858876 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"kube-root-ca.crt\"" Apr 23 13:34:28.867184 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.867158 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rtxwf"] Apr 23 13:34:28.925625 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:28.925596 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-p9wt8\" (UniqueName: \"kubernetes.io/projected/42df0b03-5714-4dda-b296-740fe29fb69f-kube-api-access-p9wt8\") pod \"downloads-6bcc868b7-rtxwf\" (UID: \"42df0b03-5714-4dda-b296-740fe29fb69f\") " pod="openshift-console/downloads-6bcc868b7-rtxwf" Apr 23 13:34:29.026435 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:29.026372 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-p9wt8\" (UniqueName: \"kubernetes.io/projected/42df0b03-5714-4dda-b296-740fe29fb69f-kube-api-access-p9wt8\") pod \"downloads-6bcc868b7-rtxwf\" (UID: 
\"42df0b03-5714-4dda-b296-740fe29fb69f\") " pod="openshift-console/downloads-6bcc868b7-rtxwf" Apr 23 13:34:29.035219 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:29.035201 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-p9wt8\" (UniqueName: \"kubernetes.io/projected/42df0b03-5714-4dda-b296-740fe29fb69f-kube-api-access-p9wt8\") pod \"downloads-6bcc868b7-rtxwf\" (UID: \"42df0b03-5714-4dda-b296-740fe29fb69f\") " pod="openshift-console/downloads-6bcc868b7-rtxwf" Apr 23 13:34:29.164255 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:29.164224 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/downloads-6bcc868b7-rtxwf" Apr 23 13:34:29.283340 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:29.283305 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/downloads-6bcc868b7-rtxwf"] Apr 23 13:34:29.286145 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:29.286121 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod42df0b03_5714_4dda_b296_740fe29fb69f.slice/crio-32651f559dd6587944fd0252e42afb2ef4b75ea602afc2f47808c57457f033bb WatchSource:0}: Error finding container 32651f559dd6587944fd0252e42afb2ef4b75ea602afc2f47808c57457f033bb: Status 404 returned error can't find the container with id 32651f559dd6587944fd0252e42afb2ef4b75ea602afc2f47808c57457f033bb Apr 23 13:34:29.334998 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:29.334969 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rtxwf" event={"ID":"42df0b03-5714-4dda-b296-740fe29fb69f","Type":"ContainerStarted","Data":"32651f559dd6587944fd0252e42afb2ef4b75ea602afc2f47808c57457f033bb"} Apr 23 13:34:36.520517 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:36.520487 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:36.520517 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:36.520523 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:38.785119 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.785090 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-7867955cd8-qm669"] Apr 23 13:34:38.790124 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.790100 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:38.792925 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.792905 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-oauth-config\"" Apr 23 13:34:38.793040 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.792927 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"service-ca\"" Apr 23 13:34:38.793937 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.793914 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-serving-cert\"" Apr 23 13:34:38.794053 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.793940 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-console\"/\"console-dockercfg-j9hh7\"" Apr 23 13:34:38.794357 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.794320 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"console-config\"" Apr 23 13:34:38.794463 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.794323 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"oauth-serving-cert\"" Apr 23 13:34:38.799209 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.799191 2576 
kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7867955cd8-qm669"] Apr 23 13:34:38.916620 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.916591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-oauth-serving-cert\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:38.916620 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.916625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-config\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:38.916828 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.916656 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-serving-cert\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:38.916828 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.916742 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-oauth-config\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:38.916828 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.916777 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-service-ca\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:38.916828 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:38.916805 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-mz9zc\" (UniqueName: \"kubernetes.io/projected/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-kube-api-access-mz9zc\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.017834 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.017782 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-oauth-serving-cert\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.017834 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.017828 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-config\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.018120 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.017866 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-serving-cert\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.018120 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:34:39.017905 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-oauth-config\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.018120 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.017934 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-service-ca\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.018120 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.017968 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-mz9zc\" (UniqueName: \"kubernetes.io/projected/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-kube-api-access-mz9zc\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.018610 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.018582 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-oauth-serving-cert\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.018824 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.018788 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-service-ca\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 
13:34:39.019124 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.019104 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-config\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.020504 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.020481 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-oauth-config\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.021175 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.021154 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-serving-cert\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.033152 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.033133 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-mz9zc\" (UniqueName: \"kubernetes.io/projected/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-kube-api-access-mz9zc\") pod \"console-7867955cd8-qm669\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:39.101772 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:39.101709 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:44.479609 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:44.479582 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-7867955cd8-qm669"] Apr 23 13:34:44.481780 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:44.481745 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod0bc4dfa7_dda2_49e5_b65b_280c78dceb55.slice/crio-d74078354512863d4c1f41acf2cfc770de5068d220bab3a5cdf6caf0fa327d53 WatchSource:0}: Error finding container d74078354512863d4c1f41acf2cfc770de5068d220bab3a5cdf6caf0fa327d53: Status 404 returned error can't find the container with id d74078354512863d4c1f41acf2cfc770de5068d220bab3a5cdf6caf0fa327d53 Apr 23 13:34:45.383540 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:45.383279 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/downloads-6bcc868b7-rtxwf" event={"ID":"42df0b03-5714-4dda-b296-740fe29fb69f","Type":"ContainerStarted","Data":"0aaa622c4d59f5bc7373526d420931c55737560c567ae8bd2693d1fd54982f72"} Apr 23 13:34:45.383741 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:45.383633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/downloads-6bcc868b7-rtxwf" Apr 23 13:34:45.385152 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:45.385093 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7867955cd8-qm669" event={"ID":"0bc4dfa7-dda2-49e5-b65b-280c78dceb55","Type":"ContainerStarted","Data":"d74078354512863d4c1f41acf2cfc770de5068d220bab3a5cdf6caf0fa327d53"} Apr 23 13:34:45.402026 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:45.401975 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/downloads-6bcc868b7-rtxwf" podStartSLOduration=2.231272121 podStartE2EDuration="17.401952996s" 
podCreationTimestamp="2026-04-23 13:34:28 +0000 UTC" firstStartedPulling="2026-04-23 13:34:29.288083464 +0000 UTC m=+190.036085397" lastFinishedPulling="2026-04-23 13:34:44.458764335 +0000 UTC m=+205.206766272" observedRunningTime="2026-04-23 13:34:45.400227846 +0000 UTC m=+206.148229803" watchObservedRunningTime="2026-04-23 13:34:45.401952996 +0000 UTC m=+206.149954990" Apr 23 13:34:45.403497 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:45.403464 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/downloads-6bcc868b7-rtxwf" Apr 23 13:34:47.970485 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:47.970456 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-58849bf947-psjnv"] Apr 23 13:34:47.987534 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:47.987508 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58849bf947-psjnv"] Apr 23 13:34:47.987634 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:47.987627 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:47.995401 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:47.995246 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-console\"/\"trusted-ca-bundle\"" Apr 23 13:34:48.102092 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.102067 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-serving-cert\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.102205 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.102124 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-oauth-serving-cert\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.102259 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.102207 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-dkb9w\" (UniqueName: \"kubernetes.io/projected/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-kube-api-access-dkb9w\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.102306 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.102267 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-service-ca\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " 
pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.102306 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.102295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-config\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.102424 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.102311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-trusted-ca-bundle\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.102468 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.102454 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-oauth-config\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.203516 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.203482 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-service-ca\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.203684 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.203537 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-config\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.203684 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.203562 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-trusted-ca-bundle\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.203684 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.203602 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-oauth-config\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.203684 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.203642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-serving-cert\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.203866 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.203688 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-oauth-serving-cert\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.203866 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.203731 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"kube-api-access-dkb9w\" (UniqueName: \"kubernetes.io/projected/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-kube-api-access-dkb9w\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.204317 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.204264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-config\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.204317 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.204264 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-service-ca\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.204507 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.204371 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-oauth-serving-cert\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.204816 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.204797 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-trusted-ca-bundle\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.206475 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.206452 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-oauth-config\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.206588 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.206569 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-serving-cert\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.213065 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.213046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-dkb9w\" (UniqueName: \"kubernetes.io/projected/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-kube-api-access-dkb9w\") pod \"console-58849bf947-psjnv\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.298774 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.298745 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:48.396957 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.396663 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7867955cd8-qm669" event={"ID":"0bc4dfa7-dda2-49e5-b65b-280c78dceb55","Type":"ContainerStarted","Data":"fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b"} Apr 23 13:34:48.417921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.417842 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-7867955cd8-qm669" podStartSLOduration=6.9226780439999995 podStartE2EDuration="10.41782311s" podCreationTimestamp="2026-04-23 13:34:38 +0000 UTC" firstStartedPulling="2026-04-23 13:34:44.483490092 +0000 UTC m=+205.231492025" lastFinishedPulling="2026-04-23 13:34:47.978635155 +0000 UTC m=+208.726637091" observedRunningTime="2026-04-23 13:34:48.416526006 +0000 UTC m=+209.164527963" watchObservedRunningTime="2026-04-23 13:34:48.41782311 +0000 UTC m=+209.165825067" Apr 23 13:34:48.438956 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:48.438932 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-58849bf947-psjnv"] Apr 23 13:34:48.441612 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:34:48.441584 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod552f95c7_627a_4a74_b8ed_f0e8f2abdfa1.slice/crio-4b89352a3b61832952a9ac9aac9774242e110c97d6eac49ae2fb4a21a371798f WatchSource:0}: Error finding container 4b89352a3b61832952a9ac9aac9774242e110c97d6eac49ae2fb4a21a371798f: Status 404 returned error can't find the container with id 4b89352a3b61832952a9ac9aac9774242e110c97d6eac49ae2fb4a21a371798f Apr 23 13:34:49.102180 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:49.102142 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" 
pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:49.102600 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:49.102266 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:49.107686 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:49.107665 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:49.401916 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:49.401828 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58849bf947-psjnv" event={"ID":"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1","Type":"ContainerStarted","Data":"ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07"} Apr 23 13:34:49.401916 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:49.401875 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58849bf947-psjnv" event={"ID":"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1","Type":"ContainerStarted","Data":"4b89352a3b61832952a9ac9aac9774242e110c97d6eac49ae2fb4a21a371798f"} Apr 23 13:34:49.406046 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:49.406017 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:34:49.420427 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:49.420379 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-58849bf947-psjnv" podStartSLOduration=2.420365045 podStartE2EDuration="2.420365045s" podCreationTimestamp="2026-04-23 13:34:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:34:49.418407787 +0000 UTC m=+210.166409753" watchObservedRunningTime="2026-04-23 13:34:49.420365045 +0000 UTC m=+210.168366999" Apr 23 13:34:56.528254 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:34:56.528225 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:56.533072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:56.533049 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-monitoring/metrics-server-7f65b95986-th9fz" Apr 23 13:34:58.299090 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:58.299056 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:58.299556 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:58.299136 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:58.303636 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:58.303616 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:58.432391 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:58.432366 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:34:58.483669 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:34:58.483633 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7867955cd8-qm669"] Apr 23 13:35:02.668286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:02.668263 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-678lb_fa0283a7-9d03-45eb-8654-fca71445e53e/dns/0.log" Apr 23 13:35:02.674436 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:02.674417 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-678lb_fa0283a7-9d03-45eb-8654-fca71445e53e/kube-rbac-proxy/0.log" Apr 23 13:35:03.047890 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:03.047866 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-dns_node-resolver-89qsg_8e915a91-9dcb-4454-9ac4-0012727f6bdd/dns-node-resolver/0.log" Apr 23 13:35:23.503195 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.503156 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-7867955cd8-qm669" podUID="0bc4dfa7-dda2-49e5-b65b-280c78dceb55" containerName="console" containerID="cri-o://fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b" gracePeriod=15 Apr 23 13:35:23.804140 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.804120 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7867955cd8-qm669_0bc4dfa7-dda2-49e5-b65b-280c78dceb55/console/0.log" Apr 23 13:35:23.804236 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.804179 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:35:23.875097 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875069 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-serving-cert\") pod \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " Apr 23 13:35:23.875220 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875127 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-service-ca\") pod \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " Apr 23 13:35:23.875220 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875178 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-oauth-serving-cert\") pod 
\"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " Apr 23 13:35:23.875355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875220 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-oauth-config\") pod \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " Apr 23 13:35:23.875355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875245 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-mz9zc\" (UniqueName: \"kubernetes.io/projected/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-kube-api-access-mz9zc\") pod \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " Apr 23 13:35:23.875355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875285 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-config\") pod \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\" (UID: \"0bc4dfa7-dda2-49e5-b65b-280c78dceb55\") " Apr 23 13:35:23.875633 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875601 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "0bc4dfa7-dda2-49e5-b65b-280c78dceb55" (UID: "0bc4dfa7-dda2-49e5-b65b-280c78dceb55"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:23.875633 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875604 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-service-ca" (OuterVolumeSpecName: "service-ca") pod "0bc4dfa7-dda2-49e5-b65b-280c78dceb55" (UID: "0bc4dfa7-dda2-49e5-b65b-280c78dceb55"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:23.875810 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.875637 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-config" (OuterVolumeSpecName: "console-config") pod "0bc4dfa7-dda2-49e5-b65b-280c78dceb55" (UID: "0bc4dfa7-dda2-49e5-b65b-280c78dceb55"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:23.877270 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.877250 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "0bc4dfa7-dda2-49e5-b65b-280c78dceb55" (UID: "0bc4dfa7-dda2-49e5-b65b-280c78dceb55"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:23.877353 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.877281 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-kube-api-access-mz9zc" (OuterVolumeSpecName: "kube-api-access-mz9zc") pod "0bc4dfa7-dda2-49e5-b65b-280c78dceb55" (UID: "0bc4dfa7-dda2-49e5-b65b-280c78dceb55"). InnerVolumeSpecName "kube-api-access-mz9zc". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:23.877395 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.877345 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "0bc4dfa7-dda2-49e5-b65b-280c78dceb55" (UID: "0bc4dfa7-dda2-49e5-b65b-280c78dceb55"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:23.976613 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.976589 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-oauth-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:23.976613 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.976611 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-mz9zc\" (UniqueName: \"kubernetes.io/projected/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-kube-api-access-mz9zc\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:23.976744 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.976626 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:23.976744 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.976639 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-console-serving-cert\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:23.976744 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.976652 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: 
\"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-service-ca\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:23.976744 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:23.976664 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/0bc4dfa7-dda2-49e5-b65b-280c78dceb55-oauth-serving-cert\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:24.501632 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.501608 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-7867955cd8-qm669_0bc4dfa7-dda2-49e5-b65b-280c78dceb55/console/0.log" Apr 23 13:35:24.501756 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.501645 2576 generic.go:358] "Generic (PLEG): container finished" podID="0bc4dfa7-dda2-49e5-b65b-280c78dceb55" containerID="fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b" exitCode=2 Apr 23 13:35:24.501756 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.501707 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-7867955cd8-qm669" Apr 23 13:35:24.501756 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.501733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7867955cd8-qm669" event={"ID":"0bc4dfa7-dda2-49e5-b65b-280c78dceb55","Type":"ContainerDied","Data":"fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b"} Apr 23 13:35:24.501862 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.501774 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-7867955cd8-qm669" event={"ID":"0bc4dfa7-dda2-49e5-b65b-280c78dceb55","Type":"ContainerDied","Data":"d74078354512863d4c1f41acf2cfc770de5068d220bab3a5cdf6caf0fa327d53"} Apr 23 13:35:24.501862 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.501795 2576 scope.go:117] "RemoveContainer" containerID="fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b" Apr 23 13:35:24.510003 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.509860 2576 scope.go:117] "RemoveContainer" containerID="fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b" Apr 23 13:35:24.510217 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:35:24.510171 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b\": container with ID starting with fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b not found: ID does not exist" containerID="fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b" Apr 23 13:35:24.510217 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.510196 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b"} err="failed to get container status \"fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b\": rpc error: code = 
NotFound desc = could not find container \"fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b\": container with ID starting with fd99afef40d8a1e08dba7b49e18c875df5e30d6a1a912e12b18d5fcfeacbe82b not found: ID does not exist" Apr 23 13:35:24.522733 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.522712 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-7867955cd8-qm669"] Apr 23 13:35:24.526675 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:24.526657 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-7867955cd8-qm669"] Apr 23 13:35:25.825585 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:25.825554 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0bc4dfa7-dda2-49e5-b65b-280c78dceb55" path="/var/lib/kubelet/pods/0bc4dfa7-dda2-49e5-b65b-280c78dceb55/volumes" Apr 23 13:35:31.531745 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:31.531715 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:35:31.533950 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:31.533932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-certs\" (UniqueName: \"kubernetes.io/secret/6bc685f5-9bf7-4830-a9ad-e4622169dcdb-metrics-certs\") pod \"network-metrics-daemon-ms5b6\" (UID: \"6bc685f5-9bf7-4830-a9ad-e4622169dcdb\") " pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:35:31.724019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:31.723991 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-multus\"/\"metrics-daemon-sa-dockercfg-lbjnn\"" Apr 23 13:35:31.731563 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:31.731544 2576 
util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-multus/network-metrics-daemon-ms5b6" Apr 23 13:35:31.859063 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:31.859031 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-multus/network-metrics-daemon-ms5b6"] Apr 23 13:35:31.862435 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:35:31.862405 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod6bc685f5_9bf7_4830_a9ad_e4622169dcdb.slice/crio-793b590fd51cd3ea0445b0882a4abcdb4b5b384284517c061a6d2794fe0b2f99 WatchSource:0}: Error finding container 793b590fd51cd3ea0445b0882a4abcdb4b5b384284517c061a6d2794fe0b2f99: Status 404 returned error can't find the container with id 793b590fd51cd3ea0445b0882a4abcdb4b5b384284517c061a6d2794fe0b2f99 Apr 23 13:35:32.218556 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.218480 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:35:32.218901 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.218864 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="alertmanager" containerID="cri-o://8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa" gracePeriod=120 Apr 23 13:35:32.219092 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.218941 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy-metric" containerID="cri-o://793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a" gracePeriod=120 Apr 23 13:35:32.219092 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.218941 2576 kuberuntime_container.go:864] "Killing container with a grace period" 
pod="openshift-monitoring/alertmanager-main-0" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy-web" containerID="cri-o://63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690" gracePeriod=120 Apr 23 13:35:32.219092 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.218983 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="config-reloader" containerID="cri-o://709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324" gracePeriod=120 Apr 23 13:35:32.219092 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.218999 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy" containerID="cri-o://a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1" gracePeriod=120 Apr 23 13:35:32.219092 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.219033 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-monitoring/alertmanager-main-0" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="prom-label-proxy" containerID="cri-o://60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f" gracePeriod=120 Apr 23 13:35:32.533010 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.532983 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerID="60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f" exitCode=0 Apr 23 13:35:32.533010 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.533006 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerID="a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1" exitCode=0 Apr 23 13:35:32.533010 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:35:32.533012 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerID="709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324" exitCode=0 Apr 23 13:35:32.533010 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.533018 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerID="8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa" exitCode=0 Apr 23 13:35:32.533535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.533057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerDied","Data":"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f"} Apr 23 13:35:32.533535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.533096 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerDied","Data":"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1"} Apr 23 13:35:32.533535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.533110 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerDied","Data":"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324"} Apr 23 13:35:32.533535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.533124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerDied","Data":"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa"} Apr 23 13:35:32.534245 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:32.534218 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="openshift-multus/network-metrics-daemon-ms5b6" event={"ID":"6bc685f5-9bf7-4830-a9ad-e4622169dcdb","Type":"ContainerStarted","Data":"793b590fd51cd3ea0445b0882a4abcdb4b5b384284517c061a6d2794fe0b2f99"} Apr 23 13:35:33.437626 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.437603 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.539142 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.539112 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ms5b6" event={"ID":"6bc685f5-9bf7-4830-a9ad-e4622169dcdb","Type":"ContainerStarted","Data":"22f9dffd6d5853c08697aa56feb864abe41addc131ff49bf4fc4c6fea44f2cea"} Apr 23 13:35:33.539583 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.539560 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-multus/network-metrics-daemon-ms5b6" event={"ID":"6bc685f5-9bf7-4830-a9ad-e4622169dcdb","Type":"ContainerStarted","Data":"aad526a0cd0a95b311f597f880f4a511e0e638c45b7252766319f0805db31963"} Apr 23 13:35:33.542163 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.542139 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerID="793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a" exitCode=0 Apr 23 13:35:33.542163 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.542162 2576 generic.go:358] "Generic (PLEG): container finished" podID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerID="63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690" exitCode=0 Apr 23 13:35:33.542315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.542206 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerDied","Data":"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a"} Apr 23 
13:35:33.542315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.542232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerDied","Data":"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690"} Apr 23 13:35:33.542315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.542244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef","Type":"ContainerDied","Data":"e56be08d179b3d659c8db8ec617aed6825833bda5ced3edb0ff574c5705eb615"} Apr 23 13:35:33.542315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.542253 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.542529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.542259 2576 scope.go:117] "RemoveContainer" containerID="60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f" Apr 23 13:35:33.548407 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548381 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-main-db\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548484 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548423 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-web\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548484 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548454 2576 reconciler_common.go:162] 
"operationExecutor.UnmountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-volume\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548578 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548489 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548578 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548518 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-r2zvm\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-kube-api-access-r2zvm\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548696 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548551 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-metric\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548750 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548720 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-cluster-tls-config\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548750 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume 
started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-metrics-client-ca\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548778 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548895 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548811 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-main-tls\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548895 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548677 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-main-db" (OuterVolumeSpecName: "alertmanager-main-db") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "alertmanager-main-db". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:35:33.548895 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548846 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-out\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.548895 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548880 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-tls-assets\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.549130 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.548908 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-web-config\") pod \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\" (UID: \"0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef\") " Apr 23 13:35:33.549192 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.549149 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-main-db\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.549192 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.549153 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle" (OuterVolumeSpecName: "alertmanager-trusted-ca-bundle") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "alertmanager-trusted-ca-bundle". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:33.549552 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.549525 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-metrics-client-ca" (OuterVolumeSpecName: "metrics-client-ca") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "metrics-client-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:35:33.550058 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.550037 2576 scope.go:117] "RemoveContainer" containerID="793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a" Apr 23 13:35:33.551739 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.551705 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-out" (OuterVolumeSpecName: "config-out") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "config-out". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:35:33.551933 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.551908 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-tls-assets" (OuterVolumeSpecName: "tls-assets") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "tls-assets". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:33.552249 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.552204 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-kube-api-access-r2zvm" (OuterVolumeSpecName: "kube-api-access-r2zvm") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "kube-api-access-r2zvm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:35:33.552249 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.552219 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-metric" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-metric") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-metric". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.552488 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.552293 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-web" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy-web") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy-web". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.553681 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.553655 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy" (OuterVolumeSpecName: "secret-alertmanager-kube-rbac-proxy") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "secret-alertmanager-kube-rbac-proxy". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.553787 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.553677 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-main-tls" (OuterVolumeSpecName: "secret-alertmanager-main-tls") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "secret-alertmanager-main-tls". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.553848 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.553819 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-volume" (OuterVolumeSpecName: "config-volume") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "config-volume". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.556528 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.556314 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-cluster-tls-config" (OuterVolumeSpecName: "cluster-tls-config") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "cluster-tls-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.559510 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.559202 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-multus/network-metrics-daemon-ms5b6" podStartSLOduration=253.624170658 podStartE2EDuration="4m14.55918674s" podCreationTimestamp="2026-04-23 13:31:19 +0000 UTC" firstStartedPulling="2026-04-23 13:35:31.864169023 +0000 UTC m=+252.612170959" lastFinishedPulling="2026-04-23 13:35:32.799185103 +0000 UTC m=+253.547187041" observedRunningTime="2026-04-23 13:35:33.558619635 +0000 UTC m=+254.306621604" watchObservedRunningTime="2026-04-23 13:35:33.55918674 +0000 UTC m=+254.307188697" Apr 23 13:35:33.563504 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.563412 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-web-config" (OuterVolumeSpecName: "web-config") pod "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" (UID: "0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef"). InnerVolumeSpecName "web-config". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:35:33.573731 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.573711 2576 scope.go:117] "RemoveContainer" containerID="a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1" Apr 23 13:35:33.579726 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.579710 2576 scope.go:117] "RemoveContainer" containerID="63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690" Apr 23 13:35:33.585484 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.585469 2576 scope.go:117] "RemoveContainer" containerID="709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324" Apr 23 13:35:33.591541 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.591528 2576 scope.go:117] "RemoveContainer" containerID="8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa" Apr 23 13:35:33.597638 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.597624 2576 scope.go:117] "RemoveContainer" containerID="556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1" Apr 23 13:35:33.603306 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.603291 2576 scope.go:117] "RemoveContainer" containerID="60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f" Apr 23 13:35:33.603524 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:35:33.603506 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f\": container with ID starting with 60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f not found: ID does not exist" containerID="60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f" Apr 23 13:35:33.603571 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.603532 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f"} err="failed to get 
container status \"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f\": rpc error: code = NotFound desc = could not find container \"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f\": container with ID starting with 60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f not found: ID does not exist" Apr 23 13:35:33.603571 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.603547 2576 scope.go:117] "RemoveContainer" containerID="793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a" Apr 23 13:35:33.603757 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:35:33.603738 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a\": container with ID starting with 793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a not found: ID does not exist" containerID="793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a" Apr 23 13:35:33.603798 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.603760 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a"} err="failed to get container status \"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a\": rpc error: code = NotFound desc = could not find container \"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a\": container with ID starting with 793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a not found: ID does not exist" Apr 23 13:35:33.603798 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.603774 2576 scope.go:117] "RemoveContainer" containerID="a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1" Apr 23 13:35:33.603975 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:35:33.603958 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code 
= NotFound desc = could not find container \"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1\": container with ID starting with a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1 not found: ID does not exist" containerID="a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1" Apr 23 13:35:33.604034 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.603984 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1"} err="failed to get container status \"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1\": rpc error: code = NotFound desc = could not find container \"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1\": container with ID starting with a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1 not found: ID does not exist" Apr 23 13:35:33.604034 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604007 2576 scope.go:117] "RemoveContainer" containerID="63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690" Apr 23 13:35:33.604198 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:35:33.604184 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690\": container with ID starting with 63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690 not found: ID does not exist" containerID="63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690" Apr 23 13:35:33.604235 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604202 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690"} err="failed to get container status \"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690\": rpc error: code = NotFound 
desc = could not find container \"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690\": container with ID starting with 63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690 not found: ID does not exist" Apr 23 13:35:33.604235 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604214 2576 scope.go:117] "RemoveContainer" containerID="709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324" Apr 23 13:35:33.604398 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:35:33.604384 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324\": container with ID starting with 709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324 not found: ID does not exist" containerID="709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324" Apr 23 13:35:33.604453 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604400 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324"} err="failed to get container status \"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324\": rpc error: code = NotFound desc = could not find container \"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324\": container with ID starting with 709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324 not found: ID does not exist" Apr 23 13:35:33.604453 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604411 2576 scope.go:117] "RemoveContainer" containerID="8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa" Apr 23 13:35:33.604573 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:35:33.604560 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa\": 
container with ID starting with 8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa not found: ID does not exist" containerID="8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa" Apr 23 13:35:33.604611 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604577 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa"} err="failed to get container status \"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa\": rpc error: code = NotFound desc = could not find container \"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa\": container with ID starting with 8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa not found: ID does not exist" Apr 23 13:35:33.604611 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604591 2576 scope.go:117] "RemoveContainer" containerID="556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1" Apr 23 13:35:33.604767 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:35:33.604752 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1\": container with ID starting with 556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1 not found: ID does not exist" containerID="556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1" Apr 23 13:35:33.604821 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604775 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1"} err="failed to get container status \"556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1\": rpc error: code = NotFound desc = could not find container \"556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1\": container with 
ID starting with 556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1 not found: ID does not exist" Apr 23 13:35:33.604821 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.604794 2576 scope.go:117] "RemoveContainer" containerID="60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f" Apr 23 13:35:33.605032 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605012 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f"} err="failed to get container status \"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f\": rpc error: code = NotFound desc = could not find container \"60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f\": container with ID starting with 60ec4e01fae8aa2d6446057ea20e9738c843e45f3874a645cb3180119f0a408f not found: ID does not exist" Apr 23 13:35:33.605032 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605031 2576 scope.go:117] "RemoveContainer" containerID="793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a" Apr 23 13:35:33.605214 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605199 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a"} err="failed to get container status \"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a\": rpc error: code = NotFound desc = could not find container \"793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a\": container with ID starting with 793d3c0921921fb4be51cce3bac485a8796f6cb274f49334481f425e5892a49a not found: ID does not exist" Apr 23 13:35:33.605256 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605214 2576 scope.go:117] "RemoveContainer" containerID="a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1" Apr 23 13:35:33.605404 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:35:33.605379 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1"} err="failed to get container status \"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1\": rpc error: code = NotFound desc = could not find container \"a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1\": container with ID starting with a577fc74c0d6b48f935b55869dbe599b9793676c08f96d2ff570fd8407c8feb1 not found: ID does not exist" Apr 23 13:35:33.605451 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605408 2576 scope.go:117] "RemoveContainer" containerID="63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690" Apr 23 13:35:33.605631 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605612 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690"} err="failed to get container status \"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690\": rpc error: code = NotFound desc = could not find container \"63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690\": container with ID starting with 63bdf4ef9b6a133e8142bf93a2c0c3d8214c890fc6a501e15ab41620b4c1f690 not found: ID does not exist" Apr 23 13:35:33.605681 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605631 2576 scope.go:117] "RemoveContainer" containerID="709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324" Apr 23 13:35:33.605841 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605826 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324"} err="failed to get container status \"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324\": rpc error: code = NotFound desc = could not find container 
\"709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324\": container with ID starting with 709da78e4f17b30be644ba79fe84b21937affde733293af892aae1aaa4535324 not found: ID does not exist" Apr 23 13:35:33.605885 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.605841 2576 scope.go:117] "RemoveContainer" containerID="8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa" Apr 23 13:35:33.606015 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.606001 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa"} err="failed to get container status \"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa\": rpc error: code = NotFound desc = could not find container \"8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa\": container with ID starting with 8898213e8ee082ec951f981f0c08cb702ee515ecdcb28b56cda5eea7ad48e8aa not found: ID does not exist" Apr 23 13:35:33.606072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.606014 2576 scope.go:117] "RemoveContainer" containerID="556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1" Apr 23 13:35:33.606190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.606176 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1"} err="failed to get container status \"556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1\": rpc error: code = NotFound desc = could not find container \"556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1\": container with ID starting with 556c327dc3b1c4ca49e577c37eaa047428d17d13abbec3632fafb57c2914bfe1 not found: ID does not exist" Apr 23 13:35:33.649837 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649814 2576 reconciler_common.go:299] "Volume detached for volume 
\"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-web\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.649922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649840 2576 reconciler_common.go:299] "Volume detached for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-volume\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.649922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649855 2576 reconciler_common.go:299] "Volume detached for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-alertmanager-trusted-ca-bundle\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.649922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649868 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-r2zvm\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-kube-api-access-r2zvm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.649922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649882 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy-metric\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.649922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649896 2576 reconciler_common.go:299] "Volume detached for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-cluster-tls-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.649922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649908 2576 reconciler_common.go:299] "Volume 
detached for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-metrics-client-ca\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.650107 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649922 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-kube-rbac-proxy\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.650107 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649935 2576 reconciler_common.go:299] "Volume detached for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-secret-alertmanager-main-tls\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.650107 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649946 2576 reconciler_common.go:299] "Volume detached for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-config-out\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.650107 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649958 2576 reconciler_common.go:299] "Volume detached for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-tls-assets\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.650107 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.649969 2576 reconciler_common.go:299] "Volume detached for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef-web-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:35:33.861172 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.861146 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 
13:35:33.868081 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.868058 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:35:33.900152 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900129 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:35:33.900508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900493 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="alertmanager" Apr 23 13:35:33.900557 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900512 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="alertmanager" Apr 23 13:35:33.900557 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900522 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0bc4dfa7-dda2-49e5-b65b-280c78dceb55" containerName="console" Apr 23 13:35:33.900557 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900530 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0bc4dfa7-dda2-49e5-b65b-280c78dceb55" containerName="console" Apr 23 13:35:33.900557 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900546 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="init-config-reloader" Apr 23 13:35:33.900557 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900555 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="init-config-reloader" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900577 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy-metric" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:35:33.900585 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy-metric" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900615 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900625 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900635 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="prom-label-proxy" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900643 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="prom-label-proxy" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900659 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="config-reloader" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900667 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="config-reloader" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900677 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy-web" Apr 23 13:35:33.900714 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900686 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy-web" Apr 23 13:35:33.900979 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900745 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="config-reloader" Apr 23 13:35:33.900979 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900755 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy" Apr 23 13:35:33.900979 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900765 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="alertmanager" Apr 23 13:35:33.900979 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900774 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy-web" Apr 23 13:35:33.900979 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900785 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="prom-label-proxy" Apr 23 13:35:33.900979 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900794 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0bc4dfa7-dda2-49e5-b65b-280c78dceb55" containerName="console" Apr 23 13:35:33.900979 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.900803 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" containerName="kube-rbac-proxy-metric" Apr 23 13:35:33.906008 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.905990 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.909529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909290 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy\"" Apr 23 13:35:33.909529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909301 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls-assets-0\"" Apr 23 13:35:33.909529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909317 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-cluster-tls-config\"" Apr 23 13:35:33.909529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909373 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-metric\"" Apr 23 13:35:33.909529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909392 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-dockercfg-jf6mr\"" Apr 23 13:35:33.909529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909426 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-web-config\"" Apr 23 13:35:33.909529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909436 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-kube-rbac-proxy-web\"" Apr 23 13:35:33.909529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909299 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"alertmanager-main-tls\"" Apr 23 13:35:33.909894 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.909561 2576 reflector.go:430] "Caches populated" type="*v1.Secret" 
reflector="object-\"openshift-monitoring\"/\"alertmanager-main-generated\"" Apr 23 13:35:33.913967 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.913932 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"alertmanager-trusted-ca-bundle\"" Apr 23 13:35:33.919225 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.919201 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:35:33.953398 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953379 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953406 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-web-config\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953422 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-ddtcr\" (UniqueName: \"kubernetes.io/projected/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-kube-api-access-ddtcr\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953458 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-assets\" 
(UniqueName: \"kubernetes.io/projected/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953486 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-config-out\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953658 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953521 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953658 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953545 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953658 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953567 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" 
Apr 23 13:35:33.953658 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953658 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953623 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953797 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953663 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953797 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953702 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:33.953797 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:33.953729 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"alertmanager-main-db\" (UniqueName: 
\"kubernetes.io/empty-dir/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054229 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054202 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-ddtcr\" (UniqueName: \"kubernetes.io/projected/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-kube-api-access-ddtcr\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054375 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054234 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054375 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-config-out\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054375 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054268 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054375 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054289 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for 
volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054375 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054375 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054347 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054664 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054520 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054664 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054558 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"config-volume\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054664 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054601 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054664 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054851 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054677 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.054851 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.054701 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-web-config\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.055221 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.055195 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-metrics-client-ca\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " 
pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.055318 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.055292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-alertmanager-trusted-ca-bundle\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.057236 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.057210 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-assets\" (UniqueName: \"kubernetes.io/projected/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-tls-assets\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.057347 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.057301 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-main-tls\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-main-tls\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.057508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.057412 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cluster-tls-config\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-cluster-tls-config\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.057508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.057469 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-out\" (UniqueName: \"kubernetes.io/empty-dir/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-config-out\") pod \"alertmanager-main-0\" (UID: 
\"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.057647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.057512 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-metric\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy-metric\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.057647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.057525 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"alertmanager-main-db\" (UniqueName: \"kubernetes.io/empty-dir/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-alertmanager-main-db\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.057826 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.057803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.057916 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.057844 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"web-config\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-web-config\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.058114 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.058095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"config-volume\" (UniqueName: 
\"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-config-volume\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.059193 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.059175 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-alertmanager-kube-rbac-proxy-web\" (UniqueName: \"kubernetes.io/secret/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-secret-alertmanager-kube-rbac-proxy-web\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.062665 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.062643 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-ddtcr\" (UniqueName: \"kubernetes.io/projected/1e3c36ce-c4ae-4013-80e3-e50f781cccbb-kube-api-access-ddtcr\") pod \"alertmanager-main-0\" (UID: \"1e3c36ce-c4ae-4013-80e3-e50f781cccbb\") " pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.219131 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.219063 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/alertmanager-main-0" Apr 23 13:35:34.346283 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.346259 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/alertmanager-main-0"] Apr 23 13:35:34.348675 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:35:34.348649 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1e3c36ce_c4ae_4013_80e3_e50f781cccbb.slice/crio-f9a6c70d0c30201d00cb94add12e1d38459d5feef171cc61434cad37bf10cdac WatchSource:0}: Error finding container f9a6c70d0c30201d00cb94add12e1d38459d5feef171cc61434cad37bf10cdac: Status 404 returned error can't find the container with id f9a6c70d0c30201d00cb94add12e1d38459d5feef171cc61434cad37bf10cdac Apr 23 13:35:34.546982 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.546955 2576 generic.go:358] "Generic (PLEG): container finished" podID="1e3c36ce-c4ae-4013-80e3-e50f781cccbb" containerID="9eb91c0ef703ac185fd620046eaf53b1be3fcf82345f3d903e3c5462f7d7f096" exitCode=0 Apr 23 13:35:34.547417 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.547028 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e3c36ce-c4ae-4013-80e3-e50f781cccbb","Type":"ContainerDied","Data":"9eb91c0ef703ac185fd620046eaf53b1be3fcf82345f3d903e3c5462f7d7f096"} Apr 23 13:35:34.547417 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:34.547050 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e3c36ce-c4ae-4013-80e3-e50f781cccbb","Type":"ContainerStarted","Data":"f9a6c70d0c30201d00cb94add12e1d38459d5feef171cc61434cad37bf10cdac"} Apr 23 13:35:35.553398 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:35.553364 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" 
event={"ID":"1e3c36ce-c4ae-4013-80e3-e50f781cccbb","Type":"ContainerStarted","Data":"d0209bbb12aa0aaad2e0705d0a3e02117ce63f2a7b5e62f4ba684853bd336ebe"} Apr 23 13:35:35.553398 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:35.553403 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e3c36ce-c4ae-4013-80e3-e50f781cccbb","Type":"ContainerStarted","Data":"bcd14c7860b9cda216cc09e19fc481f53dd91ae9c1e14eeb8cb7a86e0e685b99"} Apr 23 13:35:35.553756 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:35.553414 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e3c36ce-c4ae-4013-80e3-e50f781cccbb","Type":"ContainerStarted","Data":"4e790089a41864e32a8a2b0c33a606f456fcb67e01c2387475bf2277a451a6e3"} Apr 23 13:35:35.553756 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:35.553424 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e3c36ce-c4ae-4013-80e3-e50f781cccbb","Type":"ContainerStarted","Data":"c875fad2bd177f76f890d6dcc76bb8b39d4b85fc4d30bc4150f499d64fd150ff"} Apr 23 13:35:35.553756 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:35.553431 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e3c36ce-c4ae-4013-80e3-e50f781cccbb","Type":"ContainerStarted","Data":"2e8cf7768b574497059c67eb32c7cae8390a0baf281b5c91cf6009b91df66708"} Apr 23 13:35:35.553756 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:35.553439 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/alertmanager-main-0" event={"ID":"1e3c36ce-c4ae-4013-80e3-e50f781cccbb","Type":"ContainerStarted","Data":"eddbe9f3cc1c8a776760b3ca7170bf09e0c4d9f1991547480f56413ae77d5f0c"} Apr 23 13:35:35.579552 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:35.579514 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-monitoring/alertmanager-main-0" podStartSLOduration=2.5795003210000003 podStartE2EDuration="2.579500321s" podCreationTimestamp="2026-04-23 13:35:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:35:35.577888879 +0000 UTC m=+256.325890834" watchObservedRunningTime="2026-04-23 13:35:35.579500321 +0000 UTC m=+256.327502274" Apr 23 13:35:35.825418 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:35.825313 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef" path="/var/lib/kubelet/pods/0c944e1b-5a9c-43e7-9d12-58a2eda6c7ef/volumes" Apr 23 13:35:36.245644 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.245493 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk"] Apr 23 13:35:36.249552 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.249532 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.252148 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.252130 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-client-serving-certs-ca-bundle\"" Apr 23 13:35:36.252583 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.252558 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client\"" Apr 23 13:35:36.252792 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.252773 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-tls\"" Apr 23 13:35:36.253032 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.252991 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-kube-rbac-proxy-config\"" Apr 23 13:35:36.253702 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.253678 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"telemeter-client-dockercfg-vtmkp\"" Apr 23 13:35:36.253790 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.253745 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-monitoring\"/\"federate-client-certs\"" Apr 23 13:35:36.259090 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.259063 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-monitoring\"/\"telemeter-trusted-ca-bundle-8i12ta5c71j38\"" Apr 23 13:35:36.260839 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.260806 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk"] Apr 23 13:35:36.374794 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.374763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"kube-api-access-b6d5j\" (UniqueName: \"kubernetes.io/projected/40352799-93ad-4426-a7dc-d5487b70dc40-kube-api-access-b6d5j\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.374794 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.374797 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-telemeter-client-tls\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.375003 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.374823 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-serving-certs-ca-bundle\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.375003 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.374922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-federate-client-tls\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.375003 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.374968 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-secret-telemeter-client\") pod 
\"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.375121 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.375013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.375121 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.375033 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.375121 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.375055 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-metrics-client-ca\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.476264 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.476227 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-b6d5j\" (UniqueName: \"kubernetes.io/projected/40352799-93ad-4426-a7dc-d5487b70dc40-kube-api-access-b6d5j\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " 
pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.476264 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.476269 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-telemeter-client-tls\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.476507 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.476391 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-serving-certs-ca-bundle\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.476565 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.476550 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-federate-client-tls\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.476619 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.476584 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-secret-telemeter-client\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.476672 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.476626 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume 
\"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.476672 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.476654 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.477470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.477442 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"serving-certs-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-serving-certs-ca-bundle\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.478090 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.478065 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-telemeter-trusted-ca-bundle\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.478187 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.478133 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-metrics-client-ca\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: 
\"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.478802 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.478778 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-client-ca\" (UniqueName: \"kubernetes.io/configmap/40352799-93ad-4426-a7dc-d5487b70dc40-metrics-client-ca\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.479423 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.479398 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"telemeter-client-tls\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-telemeter-client-tls\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.479749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.479722 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client-kube-rbac-proxy-config\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-secret-telemeter-client-kube-rbac-proxy-config\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.479917 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.479897 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"secret-telemeter-client\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-secret-telemeter-client\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.483356 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.482625 2576 operation_generator.go:615] 
"MountVolume.SetUp succeeded for volume \"federate-client-tls\" (UniqueName: \"kubernetes.io/secret/40352799-93ad-4426-a7dc-d5487b70dc40-federate-client-tls\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.485669 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.485644 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-b6d5j\" (UniqueName: \"kubernetes.io/projected/40352799-93ad-4426-a7dc-d5487b70dc40-kube-api-access-b6d5j\") pod \"telemeter-client-59b6ff9b68-xmmxk\" (UID: \"40352799-93ad-4426-a7dc-d5487b70dc40\") " pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.561373 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.561339 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" Apr 23 13:35:36.692823 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:36.692772 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk"] Apr 23 13:35:36.695183 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:35:36.695160 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod40352799_93ad_4426_a7dc_d5487b70dc40.slice/crio-4528c7f96b5c602fd2749a29bbf19ba02d9fdedc414f51dc84b57d6ab29f7320 WatchSource:0}: Error finding container 4528c7f96b5c602fd2749a29bbf19ba02d9fdedc414f51dc84b57d6ab29f7320: Status 404 returned error can't find the container with id 4528c7f96b5c602fd2749a29bbf19ba02d9fdedc414f51dc84b57d6ab29f7320 Apr 23 13:35:37.560974 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:37.560931 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" 
event={"ID":"40352799-93ad-4426-a7dc-d5487b70dc40","Type":"ContainerStarted","Data":"4528c7f96b5c602fd2749a29bbf19ba02d9fdedc414f51dc84b57d6ab29f7320"} Apr 23 13:35:38.565245 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:38.565222 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" event={"ID":"40352799-93ad-4426-a7dc-d5487b70dc40","Type":"ContainerStarted","Data":"8149d16a217f781d74f215b92d53da7196b0ac344dc66b5ada926acf11dbc465"} Apr 23 13:35:38.565531 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:38.565257 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" event={"ID":"40352799-93ad-4426-a7dc-d5487b70dc40","Type":"ContainerStarted","Data":"37cbeccd9826e7b7a9bfab64cb67e11a8ab6b277259d11a07259690677a7fa4a"} Apr 23 13:35:39.569731 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:39.569695 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" event={"ID":"40352799-93ad-4426-a7dc-d5487b70dc40","Type":"ContainerStarted","Data":"af060d07d076b47cce5cc31180df27b234aaec348547e40d9422f38026dd1642"} Apr 23 13:35:39.593845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:39.593794 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-monitoring/telemeter-client-59b6ff9b68-xmmxk" podStartSLOduration=1.854119089 podStartE2EDuration="3.593779424s" podCreationTimestamp="2026-04-23 13:35:36 +0000 UTC" firstStartedPulling="2026-04-23 13:35:36.700552401 +0000 UTC m=+257.448554334" lastFinishedPulling="2026-04-23 13:35:38.440212722 +0000 UTC m=+259.188214669" observedRunningTime="2026-04-23 13:35:39.592748604 +0000 UTC m=+260.340750560" watchObservedRunningTime="2026-04-23 13:35:39.593779424 +0000 UTC m=+260.341781378" Apr 23 13:35:40.304225 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.304190 2576 kubelet.go:2537] "SyncLoop ADD" source="api" 
pods=["openshift-console/console-659678f6d7-zj8nd"] Apr 23 13:35:40.307746 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.307720 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.317852 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.317828 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-659678f6d7-zj8nd"] Apr 23 13:35:40.415507 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.415471 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-service-ca\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.415667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.415513 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-oauth-config\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.415667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.415540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-wz7tf\" (UniqueName: \"kubernetes.io/projected/75985599-9608-41f3-aa52-b39ef909fa1d-kube-api-access-wz7tf\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.415667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.415591 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: 
\"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-serving-cert\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.415667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.415629 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-trusted-ca-bundle\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.415667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.415652 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-oauth-serving-cert\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.415820 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.415694 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-console-config\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.516602 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.516556 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-console-config\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.516795 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:35:40.516650 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-service-ca\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.516795 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.516679 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-oauth-config\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.516795 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.516706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-wz7tf\" (UniqueName: \"kubernetes.io/projected/75985599-9608-41f3-aa52-b39ef909fa1d-kube-api-access-wz7tf\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.516795 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.516733 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-serving-cert\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.516795 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.516770 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-trusted-ca-bundle\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " 
pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.517041 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.516799 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-oauth-serving-cert\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.517533 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.517509 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-oauth-serving-cert\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.518216 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.518190 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-service-ca\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.518310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.518257 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-console-config\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.518442 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.518413 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-trusted-ca-bundle\") pod \"console-659678f6d7-zj8nd\" (UID: 
\"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.520386 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.520367 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-serving-cert\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.520488 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.520468 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-oauth-config\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.525447 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.525424 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-wz7tf\" (UniqueName: \"kubernetes.io/projected/75985599-9608-41f3-aa52-b39ef909fa1d-kube-api-access-wz7tf\") pod \"console-659678f6d7-zj8nd\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.619489 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.619405 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:40.938056 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:40.938018 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-659678f6d7-zj8nd"] Apr 23 13:35:40.940482 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:35:40.940454 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod75985599_9608_41f3_aa52_b39ef909fa1d.slice/crio-23c30b6934abebfd22d67ba69165ed2965fd5bce991677addc14678736615b61 WatchSource:0}: Error finding container 23c30b6934abebfd22d67ba69165ed2965fd5bce991677addc14678736615b61: Status 404 returned error can't find the container with id 23c30b6934abebfd22d67ba69165ed2965fd5bce991677addc14678736615b61 Apr 23 13:35:41.577159 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:41.577120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-659678f6d7-zj8nd" event={"ID":"75985599-9608-41f3-aa52-b39ef909fa1d","Type":"ContainerStarted","Data":"d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393"} Apr 23 13:35:41.577159 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:41.577157 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-659678f6d7-zj8nd" event={"ID":"75985599-9608-41f3-aa52-b39ef909fa1d","Type":"ContainerStarted","Data":"23c30b6934abebfd22d67ba69165ed2965fd5bce991677addc14678736615b61"} Apr 23 13:35:41.597719 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:41.597682 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-659678f6d7-zj8nd" podStartSLOduration=1.597669805 podStartE2EDuration="1.597669805s" podCreationTimestamp="2026-04-23 13:35:40 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:35:41.595789116 +0000 UTC 
m=+262.343791071" watchObservedRunningTime="2026-04-23 13:35:41.597669805 +0000 UTC m=+262.345671757" Apr 23 13:35:50.619549 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:50.619516 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:50.619549 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:50.619556 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:50.623989 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:50.623964 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:51.611313 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:51.611284 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:35:51.657564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:35:51.657524 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58849bf947-psjnv"] Apr 23 13:36:16.676084 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:16.676025 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-58849bf947-psjnv" podUID="552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" containerName="console" containerID="cri-o://ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07" gracePeriod=15 Apr 23 13:36:16.910054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:16.910026 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58849bf947-psjnv_552f95c7-627a-4a74-b8ed-f0e8f2abdfa1/console/0.log" Apr 23 13:36:16.910158 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:16.910082 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:36:17.005812 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.005738 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-oauth-config\") pod \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " Apr 23 13:36:17.005812 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.005782 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-trusted-ca-bundle\") pod \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " Apr 23 13:36:17.006011 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.005815 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-service-ca\") pod \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " Apr 23 13:36:17.006011 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.005836 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-config\") pod \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " Apr 23 13:36:17.006011 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.005877 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-serving-cert\") pod \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " Apr 23 13:36:17.006011 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:36:17.005905 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-oauth-serving-cert\") pod \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " Apr 23 13:36:17.006011 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.005956 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-dkb9w\" (UniqueName: \"kubernetes.io/projected/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-kube-api-access-dkb9w\") pod \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\" (UID: \"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1\") " Apr 23 13:36:17.006313 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.006285 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" (UID: "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:36:17.006399 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.006293 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-service-ca" (OuterVolumeSpecName: "service-ca") pod "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" (UID: "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:36:17.006454 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.006414 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-config" (OuterVolumeSpecName: "console-config") pod "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" (UID: "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:36:17.006494 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.006441 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" (UID: "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:36:17.008128 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.008094 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" (UID: "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:36:17.008220 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.008132 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" (UID: "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1"). InnerVolumeSpecName "console-serving-cert". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:36:17.008220 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.008173 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-kube-api-access-dkb9w" (OuterVolumeSpecName: "kube-api-access-dkb9w") pod "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" (UID: "552f95c7-627a-4a74-b8ed-f0e8f2abdfa1"). InnerVolumeSpecName "kube-api-access-dkb9w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:36:17.107370 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.107322 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-dkb9w\" (UniqueName: \"kubernetes.io/projected/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-kube-api-access-dkb9w\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:36:17.107370 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.107365 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-oauth-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:36:17.107545 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.107377 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-trusted-ca-bundle\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:36:17.107545 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.107390 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-service-ca\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:36:17.107545 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.107402 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: 
\"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:36:17.107545 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.107414 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-console-serving-cert\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:36:17.107545 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.107425 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1-oauth-serving-cert\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:36:17.681185 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.681160 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-58849bf947-psjnv_552f95c7-627a-4a74-b8ed-f0e8f2abdfa1/console/0.log" Apr 23 13:36:17.681651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.681198 2576 generic.go:358] "Generic (PLEG): container finished" podID="552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" containerID="ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07" exitCode=2 Apr 23 13:36:17.681651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.681227 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58849bf947-psjnv" event={"ID":"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1","Type":"ContainerDied","Data":"ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07"} Apr 23 13:36:17.681651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.681248 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-58849bf947-psjnv" event={"ID":"552f95c7-627a-4a74-b8ed-f0e8f2abdfa1","Type":"ContainerDied","Data":"4b89352a3b61832952a9ac9aac9774242e110c97d6eac49ae2fb4a21a371798f"} Apr 23 13:36:17.681651 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.681261 2576 scope.go:117] "RemoveContainer" containerID="ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07" Apr 23 13:36:17.681651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.681270 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-58849bf947-psjnv" Apr 23 13:36:17.688899 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.688882 2576 scope.go:117] "RemoveContainer" containerID="ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07" Apr 23 13:36:17.689170 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:36:17.689147 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07\": container with ID starting with ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07 not found: ID does not exist" containerID="ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07" Apr 23 13:36:17.689260 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.689176 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07"} err="failed to get container status \"ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07\": rpc error: code = NotFound desc = could not find container \"ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07\": container with ID starting with ad8e20c5b54ac318967b2b804c7c531ddd3581fd2e93f868889818aea819ce07 not found: ID does not exist" Apr 23 13:36:17.702290 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.702266 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-58849bf947-psjnv"] Apr 23 13:36:17.706994 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.706974 2576 kubelet.go:2547] "SyncLoop REMOVE" 
source="api" pods=["openshift-console/console-58849bf947-psjnv"] Apr 23 13:36:17.824758 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:17.824734 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" path="/var/lib/kubelet/pods/552f95c7-627a-4a74-b8ed-f0e8f2abdfa1/volumes" Apr 23 13:36:19.695538 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:19.695511 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:36:19.695963 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:19.695856 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:36:19.711720 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:36:19.711698 2576 kubelet.go:1628] "Image garbage collection succeeded" Apr 23 13:37:06.743461 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.743384 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-69f4894dbd-dzqvs"] Apr 23 13:37:06.743934 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.743694 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" containerName="console" Apr 23 13:37:06.743934 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.743708 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" containerName="console" Apr 23 13:37:06.743934 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.743770 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="552f95c7-627a-4a74-b8ed-f0e8f2abdfa1" containerName="console" Apr 23 13:37:06.745839 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.745811 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.757014 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.756994 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69f4894dbd-dzqvs"] Apr 23 13:37:06.893127 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.893098 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-trusted-ca-bundle\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.893127 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.893128 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-oauth-config\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.893309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.893162 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-config\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.893309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.893225 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/1b3474d0-5402-44e4-b9b8-11c8a00bf034-kube-api-access-gktd8\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" 
Apr 23 13:37:06.893309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.893249 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-service-ca\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.893309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.893270 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-oauth-serving-cert\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.893309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.893295 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-serving-cert\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.994070 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.993984 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-serving-cert\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.994070 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.994034 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-trusted-ca-bundle\") pod 
\"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.994285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.994082 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-oauth-config\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.994285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.994147 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-config\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.994285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.994191 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/1b3474d0-5402-44e4-b9b8-11c8a00bf034-kube-api-access-gktd8\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.994285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.994214 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-service-ca\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.994285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.994238 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: 
\"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-oauth-serving-cert\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.995056 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.995036 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-oauth-serving-cert\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.995151 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.995040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-trusted-ca-bundle\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.995194 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.995152 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-config\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.995194 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.995174 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-service-ca\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.996697 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.996661 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-serving-cert\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:06.996697 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:06.996684 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-oauth-config\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:07.001479 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:07.001460 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/1b3474d0-5402-44e4-b9b8-11c8a00bf034-kube-api-access-gktd8\") pod \"console-69f4894dbd-dzqvs\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:07.055445 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:07.055421 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:07.174905 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:07.174881 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-69f4894dbd-dzqvs"] Apr 23 13:37:07.177356 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:37:07.177304 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod1b3474d0_5402_44e4_b9b8_11c8a00bf034.slice/crio-7638ba4b8a1ef42ebb7b5ec8144909ddccf0c118e3c7e555a3d0642578b86e2a WatchSource:0}: Error finding container 7638ba4b8a1ef42ebb7b5ec8144909ddccf0c118e3c7e555a3d0642578b86e2a: Status 404 returned error can't find the container with id 7638ba4b8a1ef42ebb7b5ec8144909ddccf0c118e3c7e555a3d0642578b86e2a Apr 23 13:37:07.179441 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:07.179425 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:37:07.826264 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:07.826228 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f4894dbd-dzqvs" event={"ID":"1b3474d0-5402-44e4-b9b8-11c8a00bf034","Type":"ContainerStarted","Data":"da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd"} Apr 23 13:37:07.826640 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:07.826270 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f4894dbd-dzqvs" event={"ID":"1b3474d0-5402-44e4-b9b8-11c8a00bf034","Type":"ContainerStarted","Data":"7638ba4b8a1ef42ebb7b5ec8144909ddccf0c118e3c7e555a3d0642578b86e2a"} Apr 23 13:37:07.843873 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:07.843818 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-69f4894dbd-dzqvs" podStartSLOduration=1.84380223 podStartE2EDuration="1.84380223s" podCreationTimestamp="2026-04-23 13:37:06 +0000 UTC" 
firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:37:07.841689792 +0000 UTC m=+348.589691746" watchObservedRunningTime="2026-04-23 13:37:07.84380223 +0000 UTC m=+348.591804184" Apr 23 13:37:17.055628 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:17.055585 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:17.056025 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:17.055674 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:17.060499 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:17.060480 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:17.857258 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:17.857231 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:37:17.960774 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:17.960743 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-659678f6d7-zj8nd"] Apr 23 13:37:27.007722 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.007688 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kube-system/global-pull-secret-syncer-x546v"] Apr 23 13:37:27.012029 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.012010 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.014440 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.014415 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kube-system\"/\"original-pull-secret\"" Apr 23 13:37:27.018185 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.018162 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x546v"] Apr 23 13:37:27.041095 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.041072 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23dffbb8-70d5-4737-8b86-aa438dd71cff-original-pull-secret\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.041184 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.041101 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/23dffbb8-70d5-4737-8b86-aa438dd71cff-kubelet-config\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.041184 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.041179 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/23dffbb8-70d5-4737-8b86-aa438dd71cff-dbus\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.142100 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.142071 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dbus\" (UniqueName: 
\"kubernetes.io/host-path/23dffbb8-70d5-4737-8b86-aa438dd71cff-dbus\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.142220 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.142157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23dffbb8-70d5-4737-8b86-aa438dd71cff-original-pull-secret\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.142220 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.142190 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/23dffbb8-70d5-4737-8b86-aa438dd71cff-kubelet-config\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.142315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.142251 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dbus\" (UniqueName: \"kubernetes.io/host-path/23dffbb8-70d5-4737-8b86-aa438dd71cff-dbus\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.142382 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.142345 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kubelet-config\" (UniqueName: \"kubernetes.io/host-path/23dffbb8-70d5-4737-8b86-aa438dd71cff-kubelet-config\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.144286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.144262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for 
volume \"original-pull-secret\" (UniqueName: \"kubernetes.io/secret/23dffbb8-70d5-4737-8b86-aa438dd71cff-original-pull-secret\") pod \"global-pull-secret-syncer-x546v\" (UID: \"23dffbb8-70d5-4737-8b86-aa438dd71cff\") " pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.322252 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.322227 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kube-system/global-pull-secret-syncer-x546v" Apr 23 13:37:27.438000 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.437977 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kube-system/global-pull-secret-syncer-x546v"] Apr 23 13:37:27.440356 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:37:27.440319 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod23dffbb8_70d5_4737_8b86_aa438dd71cff.slice/crio-bc3202a8ed63a813846c9b2977055e4d59353cd0669aac1b65eea07255637189 WatchSource:0}: Error finding container bc3202a8ed63a813846c9b2977055e4d59353cd0669aac1b65eea07255637189: Status 404 returned error can't find the container with id bc3202a8ed63a813846c9b2977055e4d59353cd0669aac1b65eea07255637189 Apr 23 13:37:27.885711 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:27.885683 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x546v" event={"ID":"23dffbb8-70d5-4737-8b86-aa438dd71cff","Type":"ContainerStarted","Data":"bc3202a8ed63a813846c9b2977055e4d59353cd0669aac1b65eea07255637189"} Apr 23 13:37:31.897605 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:31.897562 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kube-system/global-pull-secret-syncer-x546v" event={"ID":"23dffbb8-70d5-4737-8b86-aa438dd71cff","Type":"ContainerStarted","Data":"56e0b089b7b8d25c5db95415338f322fc5921901789e6f1849061ea80cd8fc6b"} Apr 23 13:37:31.913597 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:31.913550 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kube-system/global-pull-secret-syncer-x546v" podStartSLOduration=2.311212259 podStartE2EDuration="5.913535518s" podCreationTimestamp="2026-04-23 13:37:26 +0000 UTC" firstStartedPulling="2026-04-23 13:37:27.44210578 +0000 UTC m=+368.190107716" lastFinishedPulling="2026-04-23 13:37:31.044429039 +0000 UTC m=+371.792430975" observedRunningTime="2026-04-23 13:37:31.913131039 +0000 UTC m=+372.661132994" watchObservedRunningTime="2026-04-23 13:37:31.913535518 +0000 UTC m=+372.661537472" Apr 23 13:37:42.979045 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:42.978984 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-659678f6d7-zj8nd" podUID="75985599-9608-41f3-aa52-b39ef909fa1d" containerName="console" containerID="cri-o://d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393" gracePeriod=15 Apr 23 13:37:43.210308 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.210289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-659678f6d7-zj8nd_75985599-9608-41f3-aa52-b39ef909fa1d/console/0.log" Apr 23 13:37:43.210435 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.210358 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:37:43.264922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.264847 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-service-ca\") pod \"75985599-9608-41f3-aa52-b39ef909fa1d\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " Apr 23 13:37:43.264922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.264898 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-trusted-ca-bundle\") pod \"75985599-9608-41f3-aa52-b39ef909fa1d\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " Apr 23 13:37:43.265079 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.264925 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-wz7tf\" (UniqueName: \"kubernetes.io/projected/75985599-9608-41f3-aa52-b39ef909fa1d-kube-api-access-wz7tf\") pod \"75985599-9608-41f3-aa52-b39ef909fa1d\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " Apr 23 13:37:43.265079 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265038 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-oauth-serving-cert\") pod \"75985599-9608-41f3-aa52-b39ef909fa1d\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " Apr 23 13:37:43.265157 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265141 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-serving-cert\") pod \"75985599-9608-41f3-aa52-b39ef909fa1d\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " Apr 23 13:37:43.265197 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265180 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-oauth-config\") pod \"75985599-9608-41f3-aa52-b39ef909fa1d\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " Apr 23 13:37:43.265245 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265205 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-console-config\") pod \"75985599-9608-41f3-aa52-b39ef909fa1d\" (UID: \"75985599-9608-41f3-aa52-b39ef909fa1d\") " Apr 23 13:37:43.265388 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265358 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "75985599-9608-41f3-aa52-b39ef909fa1d" (UID: "75985599-9608-41f3-aa52-b39ef909fa1d"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:37:43.265388 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265383 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-service-ca" (OuterVolumeSpecName: "service-ca") pod "75985599-9608-41f3-aa52-b39ef909fa1d" (UID: "75985599-9608-41f3-aa52-b39ef909fa1d"). InnerVolumeSpecName "service-ca". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:37:43.265593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265460 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "75985599-9608-41f3-aa52-b39ef909fa1d" (UID: "75985599-9608-41f3-aa52-b39ef909fa1d"). InnerVolumeSpecName "oauth-serving-cert". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:37:43.265593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265486 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-service-ca\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:37:43.265593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265506 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-trusted-ca-bundle\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:37:43.265805 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.265691 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-console-config" (OuterVolumeSpecName: "console-config") pod "75985599-9608-41f3-aa52-b39ef909fa1d" (UID: "75985599-9608-41f3-aa52-b39ef909fa1d"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:37:43.267061 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.267040 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/75985599-9608-41f3-aa52-b39ef909fa1d-kube-api-access-wz7tf" (OuterVolumeSpecName: "kube-api-access-wz7tf") pod "75985599-9608-41f3-aa52-b39ef909fa1d" (UID: "75985599-9608-41f3-aa52-b39ef909fa1d"). 
InnerVolumeSpecName "kube-api-access-wz7tf". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:37:43.267186 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.267165 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "75985599-9608-41f3-aa52-b39ef909fa1d" (UID: "75985599-9608-41f3-aa52-b39ef909fa1d"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:37:43.267313 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.267292 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "75985599-9608-41f3-aa52-b39ef909fa1d" (UID: "75985599-9608-41f3-aa52-b39ef909fa1d"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:37:43.365869 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.365830 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-serving-cert\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:37:43.365869 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.365865 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/75985599-9608-41f3-aa52-b39ef909fa1d-console-oauth-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:37:43.365869 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.365875 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-console-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath 
\"\"" Apr 23 13:37:43.366060 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.365884 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-wz7tf\" (UniqueName: \"kubernetes.io/projected/75985599-9608-41f3-aa52-b39ef909fa1d-kube-api-access-wz7tf\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:37:43.366060 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.365893 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/75985599-9608-41f3-aa52-b39ef909fa1d-oauth-serving-cert\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:37:43.932792 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.932767 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-659678f6d7-zj8nd_75985599-9608-41f3-aa52-b39ef909fa1d/console/0.log" Apr 23 13:37:43.932941 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.932805 2576 generic.go:358] "Generic (PLEG): container finished" podID="75985599-9608-41f3-aa52-b39ef909fa1d" containerID="d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393" exitCode=2 Apr 23 13:37:43.932941 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.932836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-659678f6d7-zj8nd" event={"ID":"75985599-9608-41f3-aa52-b39ef909fa1d","Type":"ContainerDied","Data":"d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393"} Apr 23 13:37:43.932941 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.932858 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-659678f6d7-zj8nd" event={"ID":"75985599-9608-41f3-aa52-b39ef909fa1d","Type":"ContainerDied","Data":"23c30b6934abebfd22d67ba69165ed2965fd5bce991677addc14678736615b61"} Apr 23 13:37:43.932941 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.932868 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-659678f6d7-zj8nd" Apr 23 13:37:43.933103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.932872 2576 scope.go:117] "RemoveContainer" containerID="d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393" Apr 23 13:37:43.940376 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.940360 2576 scope.go:117] "RemoveContainer" containerID="d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393" Apr 23 13:37:43.940608 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:37:43.940588 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393\": container with ID starting with d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393 not found: ID does not exist" containerID="d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393" Apr 23 13:37:43.940660 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.940617 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393"} err="failed to get container status \"d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393\": rpc error: code = NotFound desc = could not find container \"d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393\": container with ID starting with d71cf4958e5a2920197258d500d61ce25f81340a6e6a16da057353bf70275393 not found: ID does not exist" Apr 23 13:37:43.950644 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.950624 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-659678f6d7-zj8nd"] Apr 23 13:37:43.954033 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:37:43.954009 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-659678f6d7-zj8nd"] Apr 23 13:37:45.825277 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:37:45.825241 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="75985599-9608-41f3-aa52-b39ef909fa1d" path="/var/lib/kubelet/pods/75985599-9608-41f3-aa52-b39ef909fa1d/volumes" Apr 23 13:38:06.951953 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.951915 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj"] Apr 23 13:38:06.952515 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.952226 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="75985599-9608-41f3-aa52-b39ef909fa1d" containerName="console" Apr 23 13:38:06.952515 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.952236 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="75985599-9608-41f3-aa52-b39ef909fa1d" containerName="console" Apr 23 13:38:06.952515 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.952294 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="75985599-9608-41f3-aa52-b39ef909fa1d" containerName="console" Apr 23 13:38:06.954183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.954165 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:06.956692 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.956669 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"kedaorg-certs\"" Apr 23 13:38:06.956794 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.956766 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"custom-metrics-autoscaler-operator-dockercfg-k4j62\"" Apr 23 13:38:06.956847 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.956795 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"openshift-service-ca.crt\"" Apr 23 13:38:06.956847 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.956823 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"kube-root-ca.crt\"" Apr 23 13:38:06.963896 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:06.963875 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj"] Apr 23 13:38:07.043108 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:07.043081 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8886de5d-d2be-43d1-b7b6-66465c14613b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj\" (UID: \"8886de5d-d2be-43d1-b7b6-66465c14613b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:07.043218 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:07.043181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-98x2b\" (UniqueName: \"kubernetes.io/projected/8886de5d-d2be-43d1-b7b6-66465c14613b-kube-api-access-98x2b\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj\" (UID: 
\"8886de5d-d2be-43d1-b7b6-66465c14613b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:07.144108 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:07.144088 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-98x2b\" (UniqueName: \"kubernetes.io/projected/8886de5d-d2be-43d1-b7b6-66465c14613b-kube-api-access-98x2b\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj\" (UID: \"8886de5d-d2be-43d1-b7b6-66465c14613b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:07.144218 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:07.144157 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8886de5d-d2be-43d1-b7b6-66465c14613b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj\" (UID: \"8886de5d-d2be-43d1-b7b6-66465c14613b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:07.146299 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:07.146279 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/secret/8886de5d-d2be-43d1-b7b6-66465c14613b-certificates\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj\" (UID: \"8886de5d-d2be-43d1-b7b6-66465c14613b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:07.151843 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:07.151819 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-98x2b\" (UniqueName: \"kubernetes.io/projected/8886de5d-d2be-43d1-b7b6-66465c14613b-kube-api-access-98x2b\") pod \"custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj\" (UID: \"8886de5d-d2be-43d1-b7b6-66465c14613b\") " pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:07.264139 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:38:07.264088 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:07.392211 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:07.392178 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj"] Apr 23 13:38:07.395886 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:38:07.395858 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod8886de5d_d2be_43d1_b7b6_66465c14613b.slice/crio-f481881eec599e79cbb0d25f1ff82fa26b9ecbc11edaac9d94d1dd51fa61e548 WatchSource:0}: Error finding container f481881eec599e79cbb0d25f1ff82fa26b9ecbc11edaac9d94d1dd51fa61e548: Status 404 returned error can't find the container with id f481881eec599e79cbb0d25f1ff82fa26b9ecbc11edaac9d94d1dd51fa61e548 Apr 23 13:38:07.999957 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:07.999923 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" event={"ID":"8886de5d-d2be-43d1-b7b6-66465c14613b","Type":"ContainerStarted","Data":"f481881eec599e79cbb0d25f1ff82fa26b9ecbc11edaac9d94d1dd51fa61e548"} Apr 23 13:38:13.018243 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.018208 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" event={"ID":"8886de5d-d2be-43d1-b7b6-66465c14613b","Type":"ContainerStarted","Data":"deed80e0bc3b57d11b089eb87323edb86eef1149dd1bdfdf9d345eac7bb1d09b"} Apr 23 13:38:13.018627 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.018289 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:13.040928 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.040868 2576 
pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" podStartSLOduration=2.292871698 podStartE2EDuration="7.040851705s" podCreationTimestamp="2026-04-23 13:38:06 +0000 UTC" firstStartedPulling="2026-04-23 13:38:07.397725522 +0000 UTC m=+408.145727457" lastFinishedPulling="2026-04-23 13:38:12.145705528 +0000 UTC m=+412.893707464" observedRunningTime="2026-04-23 13:38:13.0395798 +0000 UTC m=+413.787581756" watchObservedRunningTime="2026-04-23 13:38:13.040851705 +0000 UTC m=+413.788853661" Apr 23 13:38:13.214297 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.214269 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-keda/keda-admission-cf49989db-zzp7m"] Apr 23 13:38:13.217495 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.217481 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:13.220023 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.219998 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-admission-webhooks-certs\"" Apr 23 13:38:13.220144 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.220121 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-keda\"/\"keda-ocp-cabundle\"" Apr 23 13:38:13.220356 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.220321 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-keda\"/\"keda-operator-dockercfg-l4gv6\"" Apr 23 13:38:13.227095 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.227075 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-zzp7m"] Apr 23 13:38:13.296526 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.296507 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"certificates\" 
(UniqueName: \"kubernetes.io/projected/64433f51-c2e9-418d-943e-9ad8e669f8b1-certificates\") pod \"keda-admission-cf49989db-zzp7m\" (UID: \"64433f51-c2e9-418d-943e-9ad8e669f8b1\") " pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:13.296640 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.296551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jr97l\" (UniqueName: \"kubernetes.io/projected/64433f51-c2e9-418d-943e-9ad8e669f8b1-kube-api-access-jr97l\") pod \"keda-admission-cf49989db-zzp7m\" (UID: \"64433f51-c2e9-418d-943e-9ad8e669f8b1\") " pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:13.397058 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.397031 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jr97l\" (UniqueName: \"kubernetes.io/projected/64433f51-c2e9-418d-943e-9ad8e669f8b1-kube-api-access-jr97l\") pod \"keda-admission-cf49989db-zzp7m\" (UID: \"64433f51-c2e9-418d-943e-9ad8e669f8b1\") " pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:13.397178 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.397087 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64433f51-c2e9-418d-943e-9ad8e669f8b1-certificates\") pod \"keda-admission-cf49989db-zzp7m\" (UID: \"64433f51-c2e9-418d-943e-9ad8e669f8b1\") " pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:13.399388 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.399368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"certificates\" (UniqueName: \"kubernetes.io/projected/64433f51-c2e9-418d-943e-9ad8e669f8b1-certificates\") pod \"keda-admission-cf49989db-zzp7m\" (UID: \"64433f51-c2e9-418d-943e-9ad8e669f8b1\") " pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:13.405513 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:38:13.405490 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jr97l\" (UniqueName: \"kubernetes.io/projected/64433f51-c2e9-418d-943e-9ad8e669f8b1-kube-api-access-jr97l\") pod \"keda-admission-cf49989db-zzp7m\" (UID: \"64433f51-c2e9-418d-943e-9ad8e669f8b1\") " pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:13.527872 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.527665 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:13.660397 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:13.660364 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-keda/keda-admission-cf49989db-zzp7m"] Apr 23 13:38:13.664158 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:38:13.664122 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod64433f51_c2e9_418d_943e_9ad8e669f8b1.slice/crio-b1fd140f82366899c8abf23faff805fd3b79773b9369c85d044d495fb564373a WatchSource:0}: Error finding container b1fd140f82366899c8abf23faff805fd3b79773b9369c85d044d495fb564373a: Status 404 returned error can't find the container with id b1fd140f82366899c8abf23faff805fd3b79773b9369c85d044d495fb564373a Apr 23 13:38:14.022525 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:14.022446 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-zzp7m" event={"ID":"64433f51-c2e9-418d-943e-9ad8e669f8b1","Type":"ContainerStarted","Data":"b1fd140f82366899c8abf23faff805fd3b79773b9369c85d044d495fb564373a"} Apr 23 13:38:16.030106 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:16.030072 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-keda/keda-admission-cf49989db-zzp7m" event={"ID":"64433f51-c2e9-418d-943e-9ad8e669f8b1","Type":"ContainerStarted","Data":"0bf9a66ed63463b7fba400f3ca37d43ba68df21ac4abfd98455272b8e66d2adb"} 
Apr 23 13:38:16.030462 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:16.030170 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:38:16.046496 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:16.046455 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-keda/keda-admission-cf49989db-zzp7m" podStartSLOduration=1.4326842929999999 podStartE2EDuration="3.046441098s" podCreationTimestamp="2026-04-23 13:38:13 +0000 UTC" firstStartedPulling="2026-04-23 13:38:13.665725035 +0000 UTC m=+414.413726986" lastFinishedPulling="2026-04-23 13:38:15.279481855 +0000 UTC m=+416.027483791" observedRunningTime="2026-04-23 13:38:16.045419424 +0000 UTC m=+416.793421393" watchObservedRunningTime="2026-04-23 13:38:16.046441098 +0000 UTC m=+416.794443053" Apr 23 13:38:34.024712 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:34.024685 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/custom-metrics-autoscaler-operator-bbf89fd5d-ttlsj" Apr 23 13:38:37.035995 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:38:37.035925 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-keda/keda-admission-cf49989db-zzp7m" Apr 23 13:39:43.484652 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.484623 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["cert-manager/cert-manager-79c8d999ff-4k5n7"] Apr 23 13:39:43.487529 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.487514 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-4k5n7" Apr 23 13:39:43.490450 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.490424 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"cert-manager\"/\"cert-manager-dockercfg-jrqjd\"" Apr 23 13:39:43.490637 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.490625 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"kube-root-ca.crt\"" Apr 23 13:39:43.491478 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.491459 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"cert-manager\"/\"openshift-service-ca.crt\"" Apr 23 13:39:43.502030 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.502005 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-4k5n7"] Apr 23 13:39:43.603421 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.603375 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/841f7975-ab9e-4596-ba6f-bc16f696c9fc-bound-sa-token\") pod \"cert-manager-79c8d999ff-4k5n7\" (UID: \"841f7975-ab9e-4596-ba6f-bc16f696c9fc\") " pod="cert-manager/cert-manager-79c8d999ff-4k5n7" Apr 23 13:39:43.603421 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.603430 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rrqj8\" (UniqueName: \"kubernetes.io/projected/841f7975-ab9e-4596-ba6f-bc16f696c9fc-kube-api-access-rrqj8\") pod \"cert-manager-79c8d999ff-4k5n7\" (UID: \"841f7975-ab9e-4596-ba6f-bc16f696c9fc\") " pod="cert-manager/cert-manager-79c8d999ff-4k5n7" Apr 23 13:39:43.704622 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.704591 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"bound-sa-token\" (UniqueName: 
\"kubernetes.io/projected/841f7975-ab9e-4596-ba6f-bc16f696c9fc-bound-sa-token\") pod \"cert-manager-79c8d999ff-4k5n7\" (UID: \"841f7975-ab9e-4596-ba6f-bc16f696c9fc\") " pod="cert-manager/cert-manager-79c8d999ff-4k5n7" Apr 23 13:39:43.704622 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.704636 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rrqj8\" (UniqueName: \"kubernetes.io/projected/841f7975-ab9e-4596-ba6f-bc16f696c9fc-kube-api-access-rrqj8\") pod \"cert-manager-79c8d999ff-4k5n7\" (UID: \"841f7975-ab9e-4596-ba6f-bc16f696c9fc\") " pod="cert-manager/cert-manager-79c8d999ff-4k5n7" Apr 23 13:39:43.713575 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.713546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"bound-sa-token\" (UniqueName: \"kubernetes.io/projected/841f7975-ab9e-4596-ba6f-bc16f696c9fc-bound-sa-token\") pod \"cert-manager-79c8d999ff-4k5n7\" (UID: \"841f7975-ab9e-4596-ba6f-bc16f696c9fc\") " pod="cert-manager/cert-manager-79c8d999ff-4k5n7" Apr 23 13:39:43.713724 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.713707 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rrqj8\" (UniqueName: \"kubernetes.io/projected/841f7975-ab9e-4596-ba6f-bc16f696c9fc-kube-api-access-rrqj8\") pod \"cert-manager-79c8d999ff-4k5n7\" (UID: \"841f7975-ab9e-4596-ba6f-bc16f696c9fc\") " pod="cert-manager/cert-manager-79c8d999ff-4k5n7" Apr 23 13:39:43.806726 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.806692 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="cert-manager/cert-manager-79c8d999ff-4k5n7" Apr 23 13:39:43.935778 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:43.935753 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["cert-manager/cert-manager-79c8d999ff-4k5n7"] Apr 23 13:39:43.938060 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:39:43.938032 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod841f7975_ab9e_4596_ba6f_bc16f696c9fc.slice/crio-2badddb6deea919e678699ef7fd88e5af4f2c4fe0affe040fff4a8434838b4f4 WatchSource:0}: Error finding container 2badddb6deea919e678699ef7fd88e5af4f2c4fe0affe040fff4a8434838b4f4: Status 404 returned error can't find the container with id 2badddb6deea919e678699ef7fd88e5af4f2c4fe0affe040fff4a8434838b4f4 Apr 23 13:39:44.299724 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:44.299694 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-4k5n7" event={"ID":"841f7975-ab9e-4596-ba6f-bc16f696c9fc","Type":"ContainerStarted","Data":"2badddb6deea919e678699ef7fd88e5af4f2c4fe0affe040fff4a8434838b4f4"} Apr 23 13:39:47.314136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:47.314057 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="cert-manager/cert-manager-79c8d999ff-4k5n7" event={"ID":"841f7975-ab9e-4596-ba6f-bc16f696c9fc","Type":"ContainerStarted","Data":"74bd176aa9089c8b6ccfa8ffe31d99a451b9ae4957cb0845d5fec4bd8efe248d"} Apr 23 13:39:47.333005 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:47.332949 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="cert-manager/cert-manager-79c8d999ff-4k5n7" podStartSLOduration=1.233770963 podStartE2EDuration="4.332932325s" podCreationTimestamp="2026-04-23 13:39:43 +0000 UTC" firstStartedPulling="2026-04-23 13:39:43.939869458 +0000 UTC m=+504.687871392" lastFinishedPulling="2026-04-23 13:39:47.039030818 +0000 UTC m=+507.787032754" 
observedRunningTime="2026-04-23 13:39:47.331575618 +0000 UTC m=+508.079577571" watchObservedRunningTime="2026-04-23 13:39:47.332932325 +0000 UTC m=+508.080934282" Apr 23 13:39:50.508961 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.508929 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-lws-operator/lws-controller-manager-868f457486-sm2xf"] Apr 23 13:39:50.512316 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.512295 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.517611 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.517589 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"kube-root-ca.crt\"" Apr 23 13:39:50.517708 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.517592 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"metrics-server-cert\"" Apr 23 13:39:50.517708 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.517615 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"lws-manager-config\"" Apr 23 13:39:50.517815 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.517708 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-lws-operator\"/\"openshift-service-ca.crt\"" Apr 23 13:39:50.517815 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.517785 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"webhook-server-cert\"" Apr 23 13:39:50.517922 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.517832 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-lws-operator\"/\"lws-controller-manager-dockercfg-bzhcc\"" Apr 23 13:39:50.536268 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.536247 2576 kubelet.go:2544] 
"SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-868f457486-sm2xf"] Apr 23 13:39:50.558369 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.558325 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/760155eb-85e1-4817-84fd-7dcc3f8f3c54-manager-config\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.558471 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.558393 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/760155eb-85e1-4817-84fd-7dcc3f8f3c54-cert\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.558471 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.558435 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/760155eb-85e1-4817-84fd-7dcc3f8f3c54-metrics-cert\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.558471 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.558455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-vkj2c\" (UniqueName: \"kubernetes.io/projected/760155eb-85e1-4817-84fd-7dcc3f8f3c54-kube-api-access-vkj2c\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.659237 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:39:50.659210 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/760155eb-85e1-4817-84fd-7dcc3f8f3c54-cert\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.659369 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.659254 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/760155eb-85e1-4817-84fd-7dcc3f8f3c54-metrics-cert\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.659369 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.659275 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-vkj2c\" (UniqueName: \"kubernetes.io/projected/760155eb-85e1-4817-84fd-7dcc3f8f3c54-kube-api-access-vkj2c\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.659472 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.659359 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/760155eb-85e1-4817-84fd-7dcc3f8f3c54-manager-config\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.659938 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.659920 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"manager-config\" (UniqueName: \"kubernetes.io/configmap/760155eb-85e1-4817-84fd-7dcc3f8f3c54-manager-config\") pod 
\"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.661607 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.661587 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"metrics-cert\" (UniqueName: \"kubernetes.io/secret/760155eb-85e1-4817-84fd-7dcc3f8f3c54-metrics-cert\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.661722 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.661706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/760155eb-85e1-4817-84fd-7dcc3f8f3c54-cert\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.668728 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.668704 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-vkj2c\" (UniqueName: \"kubernetes.io/projected/760155eb-85e1-4817-84fd-7dcc3f8f3c54-kube-api-access-vkj2c\") pod \"lws-controller-manager-868f457486-sm2xf\" (UID: \"760155eb-85e1-4817-84fd-7dcc3f8f3c54\") " pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.821081 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.821051 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:50.961003 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:50.960974 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-lws-operator/lws-controller-manager-868f457486-sm2xf"] Apr 23 13:39:50.963614 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:39:50.963589 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod760155eb_85e1_4817_84fd_7dcc3f8f3c54.slice/crio-f8001b5be390f0a061b114c29d9ace80ff91335688eab243334ce1145adcad6e WatchSource:0}: Error finding container f8001b5be390f0a061b114c29d9ace80ff91335688eab243334ce1145adcad6e: Status 404 returned error can't find the container with id f8001b5be390f0a061b114c29d9ace80ff91335688eab243334ce1145adcad6e Apr 23 13:39:51.328725 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:51.328629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" event={"ID":"760155eb-85e1-4817-84fd-7dcc3f8f3c54","Type":"ContainerStarted","Data":"f8001b5be390f0a061b114c29d9ace80ff91335688eab243334ce1145adcad6e"} Apr 23 13:39:53.336929 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:53.336893 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" event={"ID":"760155eb-85e1-4817-84fd-7dcc3f8f3c54","Type":"ContainerStarted","Data":"6ab7387f26cb366be53bd5c0479f4100b2e703b77e28d1f1faa4a6bc97348cf5"} Apr 23 13:39:53.337280 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:53.336961 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:39:53.361141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:39:53.361095 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" podStartSLOduration=1.385903569 podStartE2EDuration="3.361083047s" podCreationTimestamp="2026-04-23 13:39:50 +0000 UTC" firstStartedPulling="2026-04-23 13:39:50.965735595 +0000 UTC m=+511.713737528" lastFinishedPulling="2026-04-23 13:39:52.940915072 +0000 UTC m=+513.688917006" observedRunningTime="2026-04-23 13:39:53.359627904 +0000 UTC m=+514.107629859" watchObservedRunningTime="2026-04-23 13:39:53.361083047 +0000 UTC m=+514.109085059" Apr 23 13:40:04.342424 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:04.342392 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-lws-operator/lws-controller-manager-868f457486-sm2xf" Apr 23 13:40:47.696725 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.696696 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-console/console-59cf858f47-w8wxg"] Apr 23 13:40:47.700254 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.700228 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.718137 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.718116 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59cf858f47-w8wxg"] Apr 23 13:40:47.775310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.775282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-service-ca\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.775310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.775311 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-config\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.775482 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.775361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-oauth-serving-cert\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.775482 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.775444 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lbs97\" (UniqueName: \"kubernetes.io/projected/66fad32d-5822-402d-87d1-dc8fe7cae6be-kube-api-access-lbs97\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 
13:40:47.775564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.775485 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-trusted-ca-bundle\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.775564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.775510 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-serving-cert\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.775564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.775540 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-oauth-config\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.876249 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.876221 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-trusted-ca-bundle\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.876414 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.876257 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-serving-cert\") 
pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.876414 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.876281 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-oauth-config\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.876414 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.876317 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-service-ca\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.876414 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.876369 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-config\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.876630 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.876438 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-oauth-serving-cert\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.876630 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.876494 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lbs97\" (UniqueName: 
\"kubernetes.io/projected/66fad32d-5822-402d-87d1-dc8fe7cae6be-kube-api-access-lbs97\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.877141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.877113 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-service-ca\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.877241 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.877141 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-config\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.877241 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.877217 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-oauth-serving-cert\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.877241 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.877234 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/66fad32d-5822-402d-87d1-dc8fe7cae6be-trusted-ca-bundle\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.878746 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.878725 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume 
\"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-oauth-config\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.878827 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.878766 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/66fad32d-5822-402d-87d1-dc8fe7cae6be-console-serving-cert\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:47.887340 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:47.887307 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lbs97\" (UniqueName: \"kubernetes.io/projected/66fad32d-5822-402d-87d1-dc8fe7cae6be-kube-api-access-lbs97\") pod \"console-59cf858f47-w8wxg\" (UID: \"66fad32d-5822-402d-87d1-dc8fe7cae6be\") " pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:48.010279 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:48.010226 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:48.139383 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:48.139359 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-console/console-59cf858f47-w8wxg"] Apr 23 13:40:48.517064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:48.517027 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59cf858f47-w8wxg" event={"ID":"66fad32d-5822-402d-87d1-dc8fe7cae6be","Type":"ContainerStarted","Data":"128ec3de44104c4c81db8233aacb438f90cfe77a7f0201ffb77ec9cab5d884f4"} Apr 23 13:40:48.517064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:48.517068 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-59cf858f47-w8wxg" event={"ID":"66fad32d-5822-402d-87d1-dc8fe7cae6be","Type":"ContainerStarted","Data":"8de4beaced4d36816fc735b2159c25b754285039d469f004a0762e26eb34ccf9"} Apr 23 13:40:48.563508 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:48.563462 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-console/console-59cf858f47-w8wxg" podStartSLOduration=1.5634488260000001 podStartE2EDuration="1.563448826s" podCreationTimestamp="2026-04-23 13:40:47 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:40:48.561580515 +0000 UTC m=+569.309582480" watchObservedRunningTime="2026-04-23 13:40:48.563448826 +0000 UTC m=+569.311450780" Apr 23 13:40:54.127787 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.127757 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn"] Apr 23 13:40:54.131164 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.131146 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.134064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.134037 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kube-root-ca.crt\"" Apr 23 13:40:54.135155 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.135141 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"openshift-service-ca.crt\"" Apr 23 13:40:54.135254 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.135165 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"plugin-serving-cert\"" Apr 23 13:40:54.135254 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.135178 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"default-dockercfg-g7kfd\"" Apr 23 13:40:54.135254 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.135177 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kuadrant-system\"/\"kuadrant-console-nginx-conf\"" Apr 23 13:40:54.147944 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.147924 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn"] Apr 23 13:40:54.221109 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.221084 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5ddb56b-5f4c-4c85-a105-94515bb001c1-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.221228 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.221127 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"plugin-serving-cert\" 
(UniqueName: \"kubernetes.io/secret/a5ddb56b-5f4c-4c85-a105-94515bb001c1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.221228 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.221213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-69bck\" (UniqueName: \"kubernetes.io/projected/a5ddb56b-5f4c-4c85-a105-94515bb001c1-kube-api-access-69bck\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.321886 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.321862 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5ddb56b-5f4c-4c85-a105-94515bb001c1-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.322020 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.321895 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5ddb56b-5f4c-4c85-a105-94515bb001c1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.322020 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.321937 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-69bck\" (UniqueName: \"kubernetes.io/projected/a5ddb56b-5f4c-4c85-a105-94515bb001c1-kube-api-access-69bck\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " 
pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.322542 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.322519 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"nginx-conf\" (UniqueName: \"kubernetes.io/configmap/a5ddb56b-5f4c-4c85-a105-94515bb001c1-nginx-conf\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.324215 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.324197 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"plugin-serving-cert\" (UniqueName: \"kubernetes.io/secret/a5ddb56b-5f4c-4c85-a105-94515bb001c1-plugin-serving-cert\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.333470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.333446 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-69bck\" (UniqueName: \"kubernetes.io/projected/a5ddb56b-5f4c-4c85-a105-94515bb001c1-kube-api-access-69bck\") pod \"kuadrant-console-plugin-6c886788f8-shkqn\" (UID: \"a5ddb56b-5f4c-4c85-a105-94515bb001c1\") " pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.441069 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.440999 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" Apr 23 13:40:54.560599 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:54.560559 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn"] Apr 23 13:40:54.563020 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:40:54.562997 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-poda5ddb56b_5f4c_4c85_a105_94515bb001c1.slice/crio-6e144ba9dab5ebcbcb9430798723951b209209d3a1a037f7b8f40fb17d25192a WatchSource:0}: Error finding container 6e144ba9dab5ebcbcb9430798723951b209209d3a1a037f7b8f40fb17d25192a: Status 404 returned error can't find the container with id 6e144ba9dab5ebcbcb9430798723951b209209d3a1a037f7b8f40fb17d25192a Apr 23 13:40:55.540511 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:55.540480 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" event={"ID":"a5ddb56b-5f4c-4c85-a105-94515bb001c1","Type":"ContainerStarted","Data":"6e144ba9dab5ebcbcb9430798723951b209209d3a1a037f7b8f40fb17d25192a"} Apr 23 13:40:58.011305 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:58.011270 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:58.011305 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:58.011314 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:58.016396 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:58.016374 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:58.554045 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:58.554020 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="openshift-console/console-59cf858f47-w8wxg" Apr 23 13:40:58.619380 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:40:58.619352 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69f4894dbd-dzqvs"] Apr 23 13:41:00.558544 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:00.558506 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" event={"ID":"a5ddb56b-5f4c-4c85-a105-94515bb001c1","Type":"ContainerStarted","Data":"57cecf35028ca109e9679ea3cd3d6caf0eaa2feab993ffd1b6e9ae6d4ae46718"} Apr 23 13:41:00.575054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:00.575007 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/kuadrant-console-plugin-6c886788f8-shkqn" podStartSLOduration=1.303108504 podStartE2EDuration="6.57499251s" podCreationTimestamp="2026-04-23 13:40:54 +0000 UTC" firstStartedPulling="2026-04-23 13:40:54.564218811 +0000 UTC m=+575.312220744" lastFinishedPulling="2026-04-23 13:40:59.836102803 +0000 UTC m=+580.584104750" observedRunningTime="2026-04-23 13:41:00.573359191 +0000 UTC m=+581.321361139" watchObservedRunningTime="2026-04-23 13:41:00.57499251 +0000 UTC m=+581.322994465" Apr 23 13:41:19.730637 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:19.730606 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:41:19.731038 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:19.730857 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:41:23.644804 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.644769 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="openshift-console/console-69f4894dbd-dzqvs" 
podUID="1b3474d0-5402-44e4-b9b8-11c8a00bf034" containerName="console" containerID="cri-o://da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd" gracePeriod=15 Apr 23 13:41:23.874127 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.874107 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69f4894dbd-dzqvs_1b3474d0-5402-44e4-b9b8-11c8a00bf034/console/0.log" Apr 23 13:41:23.874226 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.874164 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:41:23.950773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.950702 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-oauth-serving-cert\") pod \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " Apr 23 13:41:23.950773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.950755 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-config\") pod \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " Apr 23 13:41:23.950773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.950776 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-trusted-ca-bundle\") pod \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " Apr 23 13:41:23.951000 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.950814 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-oauth-config\" (UniqueName: 
\"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-oauth-config\") pod \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " Apr 23 13:41:23.951000 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.950833 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-serving-cert\") pod \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " Apr 23 13:41:23.951000 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.950857 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-service-ca\") pod \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " Apr 23 13:41:23.951000 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.950894 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/1b3474d0-5402-44e4-b9b8-11c8a00bf034-kube-api-access-gktd8\") pod \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\" (UID: \"1b3474d0-5402-44e4-b9b8-11c8a00bf034\") " Apr 23 13:41:23.951205 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.951084 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-oauth-serving-cert" (OuterVolumeSpecName: "oauth-serving-cert") pod "1b3474d0-5402-44e4-b9b8-11c8a00bf034" (UID: "1b3474d0-5402-44e4-b9b8-11c8a00bf034"). InnerVolumeSpecName "oauth-serving-cert". 
PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:41:23.951264 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.951240 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-config" (OuterVolumeSpecName: "console-config") pod "1b3474d0-5402-44e4-b9b8-11c8a00bf034" (UID: "1b3474d0-5402-44e4-b9b8-11c8a00bf034"). InnerVolumeSpecName "console-config". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:41:23.951305 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.951257 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-service-ca" (OuterVolumeSpecName: "service-ca") pod "1b3474d0-5402-44e4-b9b8-11c8a00bf034" (UID: "1b3474d0-5402-44e4-b9b8-11c8a00bf034"). InnerVolumeSpecName "service-ca". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:41:23.951415 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.951390 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-trusted-ca-bundle" (OuterVolumeSpecName: "trusted-ca-bundle") pod "1b3474d0-5402-44e4-b9b8-11c8a00bf034" (UID: "1b3474d0-5402-44e4-b9b8-11c8a00bf034"). InnerVolumeSpecName "trusted-ca-bundle". PluginName "kubernetes.io/configmap", VolumeGIDValue "" Apr 23 13:41:23.953065 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.953039 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/1b3474d0-5402-44e4-b9b8-11c8a00bf034-kube-api-access-gktd8" (OuterVolumeSpecName: "kube-api-access-gktd8") pod "1b3474d0-5402-44e4-b9b8-11c8a00bf034" (UID: "1b3474d0-5402-44e4-b9b8-11c8a00bf034"). InnerVolumeSpecName "kube-api-access-gktd8". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:41:23.953065 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.953043 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-oauth-config" (OuterVolumeSpecName: "console-oauth-config") pod "1b3474d0-5402-44e4-b9b8-11c8a00bf034" (UID: "1b3474d0-5402-44e4-b9b8-11c8a00bf034"). InnerVolumeSpecName "console-oauth-config". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:41:23.953194 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:23.953117 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-serving-cert" (OuterVolumeSpecName: "console-serving-cert") pod "1b3474d0-5402-44e4-b9b8-11c8a00bf034" (UID: "1b3474d0-5402-44e4-b9b8-11c8a00bf034"). InnerVolumeSpecName "console-serving-cert". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:41:24.051540 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.051517 2576 reconciler_common.go:299] "Volume detached for volume \"oauth-serving-cert\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-oauth-serving-cert\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:41:24.051540 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.051538 2576 reconciler_common.go:299] "Volume detached for volume \"console-config\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:41:24.051647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.051547 2576 reconciler_common.go:299] "Volume detached for volume \"trusted-ca-bundle\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-trusted-ca-bundle\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:41:24.051647 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:41:24.051556 2576 reconciler_common.go:299] "Volume detached for volume \"console-oauth-config\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-oauth-config\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:41:24.051647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.051564 2576 reconciler_common.go:299] "Volume detached for volume \"console-serving-cert\" (UniqueName: \"kubernetes.io/secret/1b3474d0-5402-44e4-b9b8-11c8a00bf034-console-serving-cert\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:41:24.051647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.051573 2576 reconciler_common.go:299] "Volume detached for volume \"service-ca\" (UniqueName: \"kubernetes.io/configmap/1b3474d0-5402-44e4-b9b8-11c8a00bf034-service-ca\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:41:24.051647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.051581 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gktd8\" (UniqueName: \"kubernetes.io/projected/1b3474d0-5402-44e4-b9b8-11c8a00bf034-kube-api-access-gktd8\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:41:24.639596 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.639568 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-69f4894dbd-dzqvs_1b3474d0-5402-44e4-b9b8-11c8a00bf034/console/0.log" Apr 23 13:41:24.639776 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.639608 2576 generic.go:358] "Generic (PLEG): container finished" podID="1b3474d0-5402-44e4-b9b8-11c8a00bf034" containerID="da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd" exitCode=2 Apr 23 13:41:24.639776 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.639680 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-console/console-69f4894dbd-dzqvs" Apr 23 13:41:24.639776 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.639693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f4894dbd-dzqvs" event={"ID":"1b3474d0-5402-44e4-b9b8-11c8a00bf034","Type":"ContainerDied","Data":"da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd"} Apr 23 13:41:24.639776 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.639729 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-console/console-69f4894dbd-dzqvs" event={"ID":"1b3474d0-5402-44e4-b9b8-11c8a00bf034","Type":"ContainerDied","Data":"7638ba4b8a1ef42ebb7b5ec8144909ddccf0c118e3c7e555a3d0642578b86e2a"} Apr 23 13:41:24.639776 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.639744 2576 scope.go:117] "RemoveContainer" containerID="da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd" Apr 23 13:41:24.648157 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.648034 2576 scope.go:117] "RemoveContainer" containerID="da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd" Apr 23 13:41:24.648370 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:41:24.648266 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd\": container with ID starting with da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd not found: ID does not exist" containerID="da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd" Apr 23 13:41:24.648370 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.648286 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd"} err="failed to get container status \"da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd\": rpc error: code = 
NotFound desc = could not find container \"da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd\": container with ID starting with da5b1f6f23f018a713d0ee28160af587fde584da55cabc502ba73a45578646bd not found: ID does not exist" Apr 23 13:41:24.662464 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.662442 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-console/console-69f4894dbd-dzqvs"] Apr 23 13:41:24.665967 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:24.665948 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-console/console-69f4894dbd-dzqvs"] Apr 23 13:41:25.825058 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:25.825028 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="1b3474d0-5402-44e4-b9b8-11c8a00bf034" path="/var/lib/kubelet/pods/1b3474d0-5402-44e4-b9b8-11c8a00bf034/volumes" Apr 23 13:41:39.456236 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.456154 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-674b59b84c-dgtsn"] Apr 23 13:41:39.456679 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.456493 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="1b3474d0-5402-44e4-b9b8-11c8a00bf034" containerName="console" Apr 23 13:41:39.456679 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.456503 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="1b3474d0-5402-44e4-b9b8-11c8a00bf034" containerName="console" Apr 23 13:41:39.456679 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.456559 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="1b3474d0-5402-44e4-b9b8-11c8a00bf034" containerName="console" Apr 23 13:41:39.458273 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.458257 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-dgtsn" Apr 23 13:41:39.461165 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.461137 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-authorino-dockercfg-g79gj\"" Apr 23 13:41:39.467839 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.467818 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dgtsn"] Apr 23 13:41:39.564907 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.564880 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-cv7kx\" (UniqueName: \"kubernetes.io/projected/254a691f-5602-41bb-b16c-e2dfc8444b98-kube-api-access-cv7kx\") pod \"authorino-674b59b84c-dgtsn\" (UID: \"254a691f-5602-41bb-b16c-e2dfc8444b98\") " pod="kuadrant-system/authorino-674b59b84c-dgtsn" Apr 23 13:41:39.666057 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.666024 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-cv7kx\" (UniqueName: \"kubernetes.io/projected/254a691f-5602-41bb-b16c-e2dfc8444b98-kube-api-access-cv7kx\") pod \"authorino-674b59b84c-dgtsn\" (UID: \"254a691f-5602-41bb-b16c-e2dfc8444b98\") " pod="kuadrant-system/authorino-674b59b84c-dgtsn" Apr 23 13:41:39.674754 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.674730 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-79cbc94b89-464ld"] Apr 23 13:41:39.676934 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.676915 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-464ld" Apr 23 13:41:39.680723 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.680706 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-cv7kx\" (UniqueName: \"kubernetes.io/projected/254a691f-5602-41bb-b16c-e2dfc8444b98-kube-api-access-cv7kx\") pod \"authorino-674b59b84c-dgtsn\" (UID: \"254a691f-5602-41bb-b16c-e2dfc8444b98\") " pod="kuadrant-system/authorino-674b59b84c-dgtsn" Apr 23 13:41:39.689867 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.689846 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-464ld"] Apr 23 13:41:39.766906 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.766832 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7fgf5\" (UniqueName: \"kubernetes.io/projected/2d76d8e2-df99-447b-b517-0faf66b445b0-kube-api-access-7fgf5\") pod \"authorino-79cbc94b89-464ld\" (UID: \"2d76d8e2-df99-447b-b517-0faf66b445b0\") " pod="kuadrant-system/authorino-79cbc94b89-464ld" Apr 23 13:41:39.767686 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.767667 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-dgtsn" Apr 23 13:41:39.867719 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.867690 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7fgf5\" (UniqueName: \"kubernetes.io/projected/2d76d8e2-df99-447b-b517-0faf66b445b0-kube-api-access-7fgf5\") pod \"authorino-79cbc94b89-464ld\" (UID: \"2d76d8e2-df99-447b-b517-0faf66b445b0\") " pod="kuadrant-system/authorino-79cbc94b89-464ld" Apr 23 13:41:39.877395 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.877368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7fgf5\" (UniqueName: \"kubernetes.io/projected/2d76d8e2-df99-447b-b517-0faf66b445b0-kube-api-access-7fgf5\") pod \"authorino-79cbc94b89-464ld\" (UID: \"2d76d8e2-df99-447b-b517-0faf66b445b0\") " pod="kuadrant-system/authorino-79cbc94b89-464ld" Apr 23 13:41:39.884162 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.884098 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dgtsn"] Apr 23 13:41:39.887245 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:41:39.887221 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod254a691f_5602_41bb_b16c_e2dfc8444b98.slice/crio-4792ed060d31fb39602f4af2835a9c01969ae323ac51e8f01fa2b03e68a57da2 WatchSource:0}: Error finding container 4792ed060d31fb39602f4af2835a9c01969ae323ac51e8f01fa2b03e68a57da2: Status 404 returned error can't find the container with id 4792ed060d31fb39602f4af2835a9c01969ae323ac51e8f01fa2b03e68a57da2 Apr 23 13:41:39.993013 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:39.992987 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-464ld" Apr 23 13:41:40.105354 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:40.105317 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-464ld"] Apr 23 13:41:40.107598 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:41:40.107574 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod2d76d8e2_df99_447b_b517_0faf66b445b0.slice/crio-13c8a7d3d1cab06b154558499350d330935fee989cf9c135028cc620c93dd4f5 WatchSource:0}: Error finding container 13c8a7d3d1cab06b154558499350d330935fee989cf9c135028cc620c93dd4f5: Status 404 returned error can't find the container with id 13c8a7d3d1cab06b154558499350d330935fee989cf9c135028cc620c93dd4f5 Apr 23 13:41:40.701977 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:40.701926 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-dgtsn" event={"ID":"254a691f-5602-41bb-b16c-e2dfc8444b98","Type":"ContainerStarted","Data":"4792ed060d31fb39602f4af2835a9c01969ae323ac51e8f01fa2b03e68a57da2"} Apr 23 13:41:40.703763 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:40.703733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-464ld" event={"ID":"2d76d8e2-df99-447b-b517-0faf66b445b0","Type":"ContainerStarted","Data":"13c8a7d3d1cab06b154558499350d330935fee989cf9c135028cc620c93dd4f5"} Apr 23 13:41:42.711737 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:42.711688 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-dgtsn" event={"ID":"254a691f-5602-41bb-b16c-e2dfc8444b98","Type":"ContainerStarted","Data":"f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f"} Apr 23 13:41:42.713026 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:42.713005 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kuadrant-system/authorino-79cbc94b89-464ld" event={"ID":"2d76d8e2-df99-447b-b517-0faf66b445b0","Type":"ContainerStarted","Data":"fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7"} Apr 23 13:41:42.728627 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:42.728585 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-674b59b84c-dgtsn" podStartSLOduration=1.10215326 podStartE2EDuration="3.728571721s" podCreationTimestamp="2026-04-23 13:41:39 +0000 UTC" firstStartedPulling="2026-04-23 13:41:39.888793237 +0000 UTC m=+620.636795171" lastFinishedPulling="2026-04-23 13:41:42.515211698 +0000 UTC m=+623.263213632" observedRunningTime="2026-04-23 13:41:42.726639725 +0000 UTC m=+623.474641679" watchObservedRunningTime="2026-04-23 13:41:42.728571721 +0000 UTC m=+623.476573675" Apr 23 13:41:42.746818 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:42.746776 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-79cbc94b89-464ld" podStartSLOduration=1.335450349 podStartE2EDuration="3.746764718s" podCreationTimestamp="2026-04-23 13:41:39 +0000 UTC" firstStartedPulling="2026-04-23 13:41:40.108964683 +0000 UTC m=+620.856966616" lastFinishedPulling="2026-04-23 13:41:42.52027905 +0000 UTC m=+623.268280985" observedRunningTime="2026-04-23 13:41:42.745252411 +0000 UTC m=+623.493254378" watchObservedRunningTime="2026-04-23 13:41:42.746764718 +0000 UTC m=+623.494766673" Apr 23 13:41:42.772967 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:42.772895 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dgtsn"] Apr 23 13:41:44.724763 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:44.724702 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-674b59b84c-dgtsn" podUID="254a691f-5602-41bb-b16c-e2dfc8444b98" containerName="authorino" 
containerID="cri-o://f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f" gracePeriod=30 Apr 23 13:41:44.964493 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:44.964468 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-dgtsn" Apr 23 13:41:45.115512 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.115482 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-cv7kx\" (UniqueName: \"kubernetes.io/projected/254a691f-5602-41bb-b16c-e2dfc8444b98-kube-api-access-cv7kx\") pod \"254a691f-5602-41bb-b16c-e2dfc8444b98\" (UID: \"254a691f-5602-41bb-b16c-e2dfc8444b98\") " Apr 23 13:41:45.117707 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.117675 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/254a691f-5602-41bb-b16c-e2dfc8444b98-kube-api-access-cv7kx" (OuterVolumeSpecName: "kube-api-access-cv7kx") pod "254a691f-5602-41bb-b16c-e2dfc8444b98" (UID: "254a691f-5602-41bb-b16c-e2dfc8444b98"). InnerVolumeSpecName "kube-api-access-cv7kx". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:41:45.216928 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.216897 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-cv7kx\" (UniqueName: \"kubernetes.io/projected/254a691f-5602-41bb-b16c-e2dfc8444b98-kube-api-access-cv7kx\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:41:45.729072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.729039 2576 generic.go:358] "Generic (PLEG): container finished" podID="254a691f-5602-41bb-b16c-e2dfc8444b98" containerID="f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f" exitCode=0 Apr 23 13:41:45.729536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.729076 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-dgtsn" event={"ID":"254a691f-5602-41bb-b16c-e2dfc8444b98","Type":"ContainerDied","Data":"f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f"} Apr 23 13:41:45.729536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.729092 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-674b59b84c-dgtsn" Apr 23 13:41:45.729536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.729109 2576 scope.go:117] "RemoveContainer" containerID="f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f" Apr 23 13:41:45.729536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.729100 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-674b59b84c-dgtsn" event={"ID":"254a691f-5602-41bb-b16c-e2dfc8444b98","Type":"ContainerDied","Data":"4792ed060d31fb39602f4af2835a9c01969ae323ac51e8f01fa2b03e68a57da2"} Apr 23 13:41:45.736993 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.736977 2576 scope.go:117] "RemoveContainer" containerID="f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f" Apr 23 13:41:45.737203 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:41:45.737189 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f\": container with ID starting with f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f not found: ID does not exist" containerID="f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f" Apr 23 13:41:45.737238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.737210 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f"} err="failed to get container status \"f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f\": rpc error: code = NotFound desc = could not find container \"f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f\": container with ID starting with f48c1c2538fa1f7b2e941a87d2dca9fa33d2ffbf539adc473012e0c4f8d3b00f not found: ID does not exist" Apr 23 13:41:45.749042 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.749016 2576 
kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dgtsn"] Apr 23 13:41:45.753026 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.753008 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-674b59b84c-dgtsn"] Apr 23 13:41:45.825072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:41:45.825047 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="254a691f-5602-41bb-b16c-e2dfc8444b98" path="/var/lib/kubelet/pods/254a691f-5602-41bb-b16c-e2dfc8444b98/volumes" Apr 23 13:42:04.727026 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.726992 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kuadrant-system/authorino-68bd676465-gkm9l"] Apr 23 13:42:04.728476 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.727315 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="254a691f-5602-41bb-b16c-e2dfc8444b98" containerName="authorino" Apr 23 13:42:04.728476 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.727349 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="254a691f-5602-41bb-b16c-e2dfc8444b98" containerName="authorino" Apr 23 13:42:04.728476 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.727406 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="254a691f-5602-41bb-b16c-e2dfc8444b98" containerName="authorino" Apr 23 13:42:04.729000 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.728985 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-gkm9l" Apr 23 13:42:04.731654 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.731636 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kuadrant-system\"/\"authorino-server-cert\"" Apr 23 13:42:04.736297 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.736274 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-gkm9l"] Apr 23 13:42:04.874948 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.874913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-jh9ws\" (UniqueName: \"kubernetes.io/projected/8436e6df-a753-453a-8e1c-829c2637e784-kube-api-access-jh9ws\") pod \"authorino-68bd676465-gkm9l\" (UID: \"8436e6df-a753-453a-8e1c-829c2637e784\") " pod="kuadrant-system/authorino-68bd676465-gkm9l" Apr 23 13:42:04.875123 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.875013 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8436e6df-a753-453a-8e1c-829c2637e784-tls-cert\") pod \"authorino-68bd676465-gkm9l\" (UID: \"8436e6df-a753-453a-8e1c-829c2637e784\") " pod="kuadrant-system/authorino-68bd676465-gkm9l" Apr 23 13:42:04.975847 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.975817 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-jh9ws\" (UniqueName: \"kubernetes.io/projected/8436e6df-a753-453a-8e1c-829c2637e784-kube-api-access-jh9ws\") pod \"authorino-68bd676465-gkm9l\" (UID: \"8436e6df-a753-453a-8e1c-829c2637e784\") " pod="kuadrant-system/authorino-68bd676465-gkm9l" Apr 23 13:42:04.976024 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.975867 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-cert\" (UniqueName: 
\"kubernetes.io/secret/8436e6df-a753-453a-8e1c-829c2637e784-tls-cert\") pod \"authorino-68bd676465-gkm9l\" (UID: \"8436e6df-a753-453a-8e1c-829c2637e784\") " pod="kuadrant-system/authorino-68bd676465-gkm9l" Apr 23 13:42:04.978118 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.978068 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-cert\" (UniqueName: \"kubernetes.io/secret/8436e6df-a753-453a-8e1c-829c2637e784-tls-cert\") pod \"authorino-68bd676465-gkm9l\" (UID: \"8436e6df-a753-453a-8e1c-829c2637e784\") " pod="kuadrant-system/authorino-68bd676465-gkm9l" Apr 23 13:42:04.985750 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:04.985732 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-jh9ws\" (UniqueName: \"kubernetes.io/projected/8436e6df-a753-453a-8e1c-829c2637e784-kube-api-access-jh9ws\") pod \"authorino-68bd676465-gkm9l\" (UID: \"8436e6df-a753-453a-8e1c-829c2637e784\") " pod="kuadrant-system/authorino-68bd676465-gkm9l" Apr 23 13:42:05.037588 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:05.037550 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kuadrant-system/authorino-68bd676465-gkm9l" Apr 23 13:42:05.156448 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:05.156427 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kuadrant-system/authorino-68bd676465-gkm9l"] Apr 23 13:42:05.158430 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:42:05.158398 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod8436e6df_a753_453a_8e1c_829c2637e784.slice/crio-0d41f836658cbe7d5d3ae3e2b4fbce4942dd66a31de2afa40b9ffec8cd389d78 WatchSource:0}: Error finding container 0d41f836658cbe7d5d3ae3e2b4fbce4942dd66a31de2afa40b9ffec8cd389d78: Status 404 returned error can't find the container with id 0d41f836658cbe7d5d3ae3e2b4fbce4942dd66a31de2afa40b9ffec8cd389d78 Apr 23 13:42:05.798985 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:05.798955 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-gkm9l" event={"ID":"8436e6df-a753-453a-8e1c-829c2637e784","Type":"ContainerStarted","Data":"0d41f836658cbe7d5d3ae3e2b4fbce4942dd66a31de2afa40b9ffec8cd389d78"} Apr 23 13:42:06.803162 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:06.803124 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-68bd676465-gkm9l" event={"ID":"8436e6df-a753-453a-8e1c-829c2637e784","Type":"ContainerStarted","Data":"1ed83163012c0fef25279b6f195f509ec14aadf0875a991216d62f5a8001ab56"} Apr 23 13:42:06.818161 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:06.818122 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kuadrant-system/authorino-68bd676465-gkm9l" podStartSLOduration=1.9391321160000001 podStartE2EDuration="2.818108938s" podCreationTimestamp="2026-04-23 13:42:04 +0000 UTC" firstStartedPulling="2026-04-23 13:42:05.159658354 +0000 UTC m=+645.907660287" lastFinishedPulling="2026-04-23 13:42:06.038635161 +0000 UTC m=+646.786637109" 
observedRunningTime="2026-04-23 13:42:06.817689049 +0000 UTC m=+647.565691004" watchObservedRunningTime="2026-04-23 13:42:06.818108938 +0000 UTC m=+647.566110893" Apr 23 13:42:06.842365 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:06.842314 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-464ld"] Apr 23 13:42:06.842580 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:06.842558 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kuadrant-system/authorino-79cbc94b89-464ld" podUID="2d76d8e2-df99-447b-b517-0faf66b445b0" containerName="authorino" containerID="cri-o://fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7" gracePeriod=30 Apr 23 13:42:07.092670 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.092642 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-464ld" Apr 23 13:42:07.194142 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.194112 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7fgf5\" (UniqueName: \"kubernetes.io/projected/2d76d8e2-df99-447b-b517-0faf66b445b0-kube-api-access-7fgf5\") pod \"2d76d8e2-df99-447b-b517-0faf66b445b0\" (UID: \"2d76d8e2-df99-447b-b517-0faf66b445b0\") " Apr 23 13:42:07.196087 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.196057 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2d76d8e2-df99-447b-b517-0faf66b445b0-kube-api-access-7fgf5" (OuterVolumeSpecName: "kube-api-access-7fgf5") pod "2d76d8e2-df99-447b-b517-0faf66b445b0" (UID: "2d76d8e2-df99-447b-b517-0faf66b445b0"). InnerVolumeSpecName "kube-api-access-7fgf5". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:42:07.295639 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.295604 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7fgf5\" (UniqueName: \"kubernetes.io/projected/2d76d8e2-df99-447b-b517-0faf66b445b0-kube-api-access-7fgf5\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:42:07.806979 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.806941 2576 generic.go:358] "Generic (PLEG): container finished" podID="2d76d8e2-df99-447b-b517-0faf66b445b0" containerID="fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7" exitCode=0 Apr 23 13:42:07.807400 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.806990 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kuadrant-system/authorino-79cbc94b89-464ld" Apr 23 13:42:07.807400 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.807020 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-464ld" event={"ID":"2d76d8e2-df99-447b-b517-0faf66b445b0","Type":"ContainerDied","Data":"fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7"} Apr 23 13:42:07.807400 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.807052 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kuadrant-system/authorino-79cbc94b89-464ld" event={"ID":"2d76d8e2-df99-447b-b517-0faf66b445b0","Type":"ContainerDied","Data":"13c8a7d3d1cab06b154558499350d330935fee989cf9c135028cc620c93dd4f5"} Apr 23 13:42:07.807400 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.807067 2576 scope.go:117] "RemoveContainer" containerID="fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7" Apr 23 13:42:07.815383 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.815325 2576 scope.go:117] "RemoveContainer" containerID="fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7" Apr 23 13:42:07.815593 ip-10-0-135-229 kubenswrapper[2576]: 
E0423 13:42:07.815578 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7\": container with ID starting with fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7 not found: ID does not exist" containerID="fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7" Apr 23 13:42:07.815634 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.815602 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7"} err="failed to get container status \"fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7\": rpc error: code = NotFound desc = could not find container \"fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7\": container with ID starting with fdf493d87349f86339ecd256363d8ac62b0b7d40cf85163a9d82f644ac3ba9e7 not found: ID does not exist" Apr 23 13:42:07.827167 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.827143 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-464ld"] Apr 23 13:42:07.830780 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:07.830761 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kuadrant-system/authorino-79cbc94b89-464ld"] Apr 23 13:42:09.825407 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:09.825377 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2d76d8e2-df99-447b-b517-0faf66b445b0" path="/var/lib/kubelet/pods/2d76d8e2-df99-447b-b517-0faf66b445b0/volumes" Apr 23 13:42:24.148154 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.148122 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/seaweedfs-86cc847c5c-nxr6l"] Apr 23 13:42:24.149121 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.149092 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: 
removing container" podUID="2d76d8e2-df99-447b-b517-0faf66b445b0" containerName="authorino" Apr 23 13:42:24.149121 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.149119 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2d76d8e2-df99-447b-b517-0faf66b445b0" containerName="authorino" Apr 23 13:42:24.149258 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.149196 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2d76d8e2-df99-447b-b517-0faf66b445b0" containerName="authorino" Apr 23 13:42:24.153814 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.153794 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:24.156961 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.156938 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"mlpipeline-s3-artifact\"" Apr 23 13:42:24.157158 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.156946 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"default-dockercfg-f84kp\"" Apr 23 13:42:24.157158 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.156946 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"openshift-service-ca.crt\"" Apr 23 13:42:24.157312 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.156999 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve\"/\"kube-root-ca.crt\"" Apr 23 13:42:24.160374 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.160324 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-nxr6l"] Apr 23 13:42:24.216484 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.216453 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2f643ad5-5f62-4432-bb1f-e8cdee3a5040-data\") pod \"seaweedfs-86cc847c5c-nxr6l\" 
(UID: \"2f643ad5-5f62-4432-bb1f-e8cdee3a5040\") " pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:24.216610 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.216519 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-fncsf\" (UniqueName: \"kubernetes.io/projected/2f643ad5-5f62-4432-bb1f-e8cdee3a5040-kube-api-access-fncsf\") pod \"seaweedfs-86cc847c5c-nxr6l\" (UID: \"2f643ad5-5f62-4432-bb1f-e8cdee3a5040\") " pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:24.317885 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.317858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2f643ad5-5f62-4432-bb1f-e8cdee3a5040-data\") pod \"seaweedfs-86cc847c5c-nxr6l\" (UID: \"2f643ad5-5f62-4432-bb1f-e8cdee3a5040\") " pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:24.318019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.317902 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-fncsf\" (UniqueName: \"kubernetes.io/projected/2f643ad5-5f62-4432-bb1f-e8cdee3a5040-kube-api-access-fncsf\") pod \"seaweedfs-86cc847c5c-nxr6l\" (UID: \"2f643ad5-5f62-4432-bb1f-e8cdee3a5040\") " pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:24.318203 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.318184 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"data\" (UniqueName: \"kubernetes.io/empty-dir/2f643ad5-5f62-4432-bb1f-e8cdee3a5040-data\") pod \"seaweedfs-86cc847c5c-nxr6l\" (UID: \"2f643ad5-5f62-4432-bb1f-e8cdee3a5040\") " pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:24.327122 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.327099 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-fncsf\" (UniqueName: \"kubernetes.io/projected/2f643ad5-5f62-4432-bb1f-e8cdee3a5040-kube-api-access-fncsf\") pod 
\"seaweedfs-86cc847c5c-nxr6l\" (UID: \"2f643ad5-5f62-4432-bb1f-e8cdee3a5040\") " pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:24.464251 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.464172 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:24.579726 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.579683 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/seaweedfs-86cc847c5c-nxr6l"] Apr 23 13:42:24.582969 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:42:24.582937 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2f643ad5_5f62_4432_bb1f_e8cdee3a5040.slice/crio-d1aed674225e7cb0b453c343264d34f61430f975d161a3f5e0f24b78dbb1a5d7 WatchSource:0}: Error finding container d1aed674225e7cb0b453c343264d34f61430f975d161a3f5e0f24b78dbb1a5d7: Status 404 returned error can't find the container with id d1aed674225e7cb0b453c343264d34f61430f975d161a3f5e0f24b78dbb1a5d7 Apr 23 13:42:24.584258 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.584242 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:42:24.864685 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:24.864659 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-nxr6l" event={"ID":"2f643ad5-5f62-4432-bb1f-e8cdee3a5040","Type":"ContainerStarted","Data":"d1aed674225e7cb0b453c343264d34f61430f975d161a3f5e0f24b78dbb1a5d7"} Apr 23 13:42:27.875007 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:27.874973 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/seaweedfs-86cc847c5c-nxr6l" event={"ID":"2f643ad5-5f62-4432-bb1f-e8cdee3a5040","Type":"ContainerStarted","Data":"fafd0e72e379fcfa1320c7257f887531889888c741015100f3ee10bfeaeaaa9b"} Apr 23 13:42:27.875371 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:27.875115 2576 kubelet.go:2658] 
"SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:42:27.891970 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:27.891928 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/seaweedfs-86cc847c5c-nxr6l" podStartSLOduration=1.47426316 podStartE2EDuration="3.891915465s" podCreationTimestamp="2026-04-23 13:42:24 +0000 UTC" firstStartedPulling="2026-04-23 13:42:24.584406673 +0000 UTC m=+665.332408606" lastFinishedPulling="2026-04-23 13:42:27.002058978 +0000 UTC m=+667.750060911" observedRunningTime="2026-04-23 13:42:27.89016368 +0000 UTC m=+668.638165633" watchObservedRunningTime="2026-04-23 13:42:27.891915465 +0000 UTC m=+668.639917430" Apr 23 13:42:33.879832 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:42:33.879803 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/seaweedfs-86cc847c5c-nxr6l" Apr 23 13:43:35.391943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.391910 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/odh-model-controller-696fc77849-7dz84"] Apr 23 13:43:35.395469 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.395445 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:35.398111 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.398096 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-dockercfg-rxrsn\"" Apr 23 13:43:35.398475 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.398460 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve\"/\"odh-model-controller-webhook-cert\"" Apr 23 13:43:35.406060 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.406035 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7dz84"] Apr 23 13:43:35.471281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.471248 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9584647-a6a8-48b8-8e90-ef64387c7b72-cert\") pod \"odh-model-controller-696fc77849-7dz84\" (UID: \"e9584647-a6a8-48b8-8e90-ef64387c7b72\") " pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:35.471281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.471282 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r2vv\" (UniqueName: \"kubernetes.io/projected/e9584647-a6a8-48b8-8e90-ef64387c7b72-kube-api-access-7r2vv\") pod \"odh-model-controller-696fc77849-7dz84\" (UID: \"e9584647-a6a8-48b8-8e90-ef64387c7b72\") " pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:35.572353 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.572308 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9584647-a6a8-48b8-8e90-ef64387c7b72-cert\") pod \"odh-model-controller-696fc77849-7dz84\" (UID: \"e9584647-a6a8-48b8-8e90-ef64387c7b72\") " pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:35.572506 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.572433 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r2vv\" (UniqueName: \"kubernetes.io/projected/e9584647-a6a8-48b8-8e90-ef64387c7b72-kube-api-access-7r2vv\") pod \"odh-model-controller-696fc77849-7dz84\" (UID: \"e9584647-a6a8-48b8-8e90-ef64387c7b72\") " pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:35.572506 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:43:35.572436 2576 secret.go:189] Couldn't get secret kserve/odh-model-controller-webhook-cert: secret "odh-model-controller-webhook-cert" not found Apr 23 13:43:35.572612 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:43:35.572550 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/e9584647-a6a8-48b8-8e90-ef64387c7b72-cert podName:e9584647-a6a8-48b8-8e90-ef64387c7b72 nodeName:}" failed. No retries permitted until 2026-04-23 13:43:36.072532855 +0000 UTC m=+736.820534788 (durationBeforeRetry 500ms). 
Error: MountVolume.SetUp failed for volume "cert" (UniqueName: "kubernetes.io/secret/e9584647-a6a8-48b8-8e90-ef64387c7b72-cert") pod "odh-model-controller-696fc77849-7dz84" (UID: "e9584647-a6a8-48b8-8e90-ef64387c7b72") : secret "odh-model-controller-webhook-cert" not found Apr 23 13:43:35.584936 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:35.584915 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r2vv\" (UniqueName: \"kubernetes.io/projected/e9584647-a6a8-48b8-8e90-ef64387c7b72-kube-api-access-7r2vv\") pod \"odh-model-controller-696fc77849-7dz84\" (UID: \"e9584647-a6a8-48b8-8e90-ef64387c7b72\") " pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:36.075495 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:36.075461 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9584647-a6a8-48b8-8e90-ef64387c7b72-cert\") pod \"odh-model-controller-696fc77849-7dz84\" (UID: \"e9584647-a6a8-48b8-8e90-ef64387c7b72\") " pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:36.077801 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:36.077780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"cert\" (UniqueName: \"kubernetes.io/secret/e9584647-a6a8-48b8-8e90-ef64387c7b72-cert\") pod \"odh-model-controller-696fc77849-7dz84\" (UID: \"e9584647-a6a8-48b8-8e90-ef64387c7b72\") " pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:36.307290 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:36.307256 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:36.424429 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:36.424395 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/odh-model-controller-696fc77849-7dz84"] Apr 23 13:43:36.427752 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:43:36.427725 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pode9584647_a6a8_48b8_8e90_ef64387c7b72.slice/crio-815ea43388cc41de36ad66f7bf987730755e6495cad3edfac7e39936b77dfc23 WatchSource:0}: Error finding container 815ea43388cc41de36ad66f7bf987730755e6495cad3edfac7e39936b77dfc23: Status 404 returned error can't find the container with id 815ea43388cc41de36ad66f7bf987730755e6495cad3edfac7e39936b77dfc23 Apr 23 13:43:37.095161 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:37.095121 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7dz84" event={"ID":"e9584647-a6a8-48b8-8e90-ef64387c7b72","Type":"ContainerStarted","Data":"815ea43388cc41de36ad66f7bf987730755e6495cad3edfac7e39936b77dfc23"} Apr 23 13:43:39.103672 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:39.103629 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/odh-model-controller-696fc77849-7dz84" event={"ID":"e9584647-a6a8-48b8-8e90-ef64387c7b72","Type":"ContainerStarted","Data":"118699b89db6602582f9de621e88553288f2cd6f5cd703c45086d11c61805855"} Apr 23 13:43:39.104134 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:39.103733 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:39.120409 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:39.120362 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve/odh-model-controller-696fc77849-7dz84" podStartSLOduration=1.775900575 podStartE2EDuration="4.12032163s" 
podCreationTimestamp="2026-04-23 13:43:35 +0000 UTC" firstStartedPulling="2026-04-23 13:43:36.429293488 +0000 UTC m=+737.177295424" lastFinishedPulling="2026-04-23 13:43:38.773714542 +0000 UTC m=+739.521716479" observedRunningTime="2026-04-23 13:43:39.119786646 +0000 UTC m=+739.867788602" watchObservedRunningTime="2026-04-23 13:43:39.12032163 +0000 UTC m=+739.868323585" Apr 23 13:43:50.109938 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:50.109906 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve/odh-model-controller-696fc77849-7dz84" Apr 23 13:43:51.051013 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:51.050976 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve/s3-init-ljq5v"] Apr 23 13:43:51.054395 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:51.054377 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-ljq5v" Apr 23 13:43:51.063070 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:51.063045 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-ljq5v"] Apr 23 13:43:51.113662 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:51.113625 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-6k8bv\" (UniqueName: \"kubernetes.io/projected/d32521c3-bfb8-4bef-82ea-fa15b572a3a7-kube-api-access-6k8bv\") pod \"s3-init-ljq5v\" (UID: \"d32521c3-bfb8-4bef-82ea-fa15b572a3a7\") " pod="kserve/s3-init-ljq5v" Apr 23 13:43:51.214456 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:51.214401 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-6k8bv\" (UniqueName: \"kubernetes.io/projected/d32521c3-bfb8-4bef-82ea-fa15b572a3a7-kube-api-access-6k8bv\") pod \"s3-init-ljq5v\" (UID: \"d32521c3-bfb8-4bef-82ea-fa15b572a3a7\") " pod="kserve/s3-init-ljq5v" Apr 23 13:43:51.223233 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:51.223205 2576 
operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-6k8bv\" (UniqueName: \"kubernetes.io/projected/d32521c3-bfb8-4bef-82ea-fa15b572a3a7-kube-api-access-6k8bv\") pod \"s3-init-ljq5v\" (UID: \"d32521c3-bfb8-4bef-82ea-fa15b572a3a7\") " pod="kserve/s3-init-ljq5v" Apr 23 13:43:51.363420 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:51.363299 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve/s3-init-ljq5v" Apr 23 13:43:51.477043 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:51.477014 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve/s3-init-ljq5v"] Apr 23 13:43:51.479861 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:43:51.479833 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-podd32521c3_bfb8_4bef_82ea_fa15b572a3a7.slice/crio-4ee82bf1ebd0469a64d48a7923d304da0c3884743d995575cab52122a02f47b4 WatchSource:0}: Error finding container 4ee82bf1ebd0469a64d48a7923d304da0c3884743d995575cab52122a02f47b4: Status 404 returned error can't find the container with id 4ee82bf1ebd0469a64d48a7923d304da0c3884743d995575cab52122a02f47b4 Apr 23 13:43:52.147524 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:52.147474 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-ljq5v" event={"ID":"d32521c3-bfb8-4bef-82ea-fa15b572a3a7","Type":"ContainerStarted","Data":"4ee82bf1ebd0469a64d48a7923d304da0c3884743d995575cab52122a02f47b4"} Apr 23 13:43:56.166703 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:56.166664 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-ljq5v" event={"ID":"d32521c3-bfb8-4bef-82ea-fa15b572a3a7","Type":"ContainerStarted","Data":"588b3fd160749407210278a50f8660e568a017c5b41dfa172afc723129cefc34"} Apr 23 13:43:56.184576 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:56.184534 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" 
pod="kserve/s3-init-ljq5v" podStartSLOduration=0.948289151 podStartE2EDuration="5.184520419s" podCreationTimestamp="2026-04-23 13:43:51 +0000 UTC" firstStartedPulling="2026-04-23 13:43:51.48151408 +0000 UTC m=+752.229516013" lastFinishedPulling="2026-04-23 13:43:55.717745346 +0000 UTC m=+756.465747281" observedRunningTime="2026-04-23 13:43:56.182705592 +0000 UTC m=+756.930707548" watchObservedRunningTime="2026-04-23 13:43:56.184520419 +0000 UTC m=+756.932522373" Apr 23 13:43:59.179841 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:59.179807 2576 generic.go:358] "Generic (PLEG): container finished" podID="d32521c3-bfb8-4bef-82ea-fa15b572a3a7" containerID="588b3fd160749407210278a50f8660e568a017c5b41dfa172afc723129cefc34" exitCode=0 Apr 23 13:43:59.180208 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:43:59.179873 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-ljq5v" event={"ID":"d32521c3-bfb8-4bef-82ea-fa15b572a3a7","Type":"ContainerDied","Data":"588b3fd160749407210278a50f8660e568a017c5b41dfa172afc723129cefc34"} Apr 23 13:44:00.307791 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:00.307770 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-ljq5v" Apr 23 13:44:00.400833 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:00.400801 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-6k8bv\" (UniqueName: \"kubernetes.io/projected/d32521c3-bfb8-4bef-82ea-fa15b572a3a7-kube-api-access-6k8bv\") pod \"d32521c3-bfb8-4bef-82ea-fa15b572a3a7\" (UID: \"d32521c3-bfb8-4bef-82ea-fa15b572a3a7\") " Apr 23 13:44:00.402874 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:00.402849 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d32521c3-bfb8-4bef-82ea-fa15b572a3a7-kube-api-access-6k8bv" (OuterVolumeSpecName: "kube-api-access-6k8bv") pod "d32521c3-bfb8-4bef-82ea-fa15b572a3a7" (UID: "d32521c3-bfb8-4bef-82ea-fa15b572a3a7"). InnerVolumeSpecName "kube-api-access-6k8bv". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:44:00.501795 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:00.501692 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-6k8bv\" (UniqueName: \"kubernetes.io/projected/d32521c3-bfb8-4bef-82ea-fa15b572a3a7-kube-api-access-6k8bv\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:44:01.190522 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:01.190492 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve/s3-init-ljq5v" Apr 23 13:44:01.190684 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:01.190487 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve/s3-init-ljq5v" event={"ID":"d32521c3-bfb8-4bef-82ea-fa15b572a3a7","Type":"ContainerDied","Data":"4ee82bf1ebd0469a64d48a7923d304da0c3884743d995575cab52122a02f47b4"} Apr 23 13:44:01.190684 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:01.190598 2576 pod_container_deletor.go:80] "Container not found in pod's containers" containerID="4ee82bf1ebd0469a64d48a7923d304da0c3884743d995575cab52122a02f47b4" Apr 23 13:44:20.487998 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.487964 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"] Apr 23 13:44:20.488439 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.488321 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d32521c3-bfb8-4bef-82ea-fa15b572a3a7" containerName="s3-init" Apr 23 13:44:20.488439 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.488346 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d32521c3-bfb8-4bef-82ea-fa15b572a3a7" containerName="s3-init" Apr 23 13:44:20.488439 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.488397 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d32521c3-bfb8-4bef-82ea-fa15b572a3a7" containerName="s3-init" Apr 23 13:44:20.491303 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.491286 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" Apr 23 13:44:20.495276 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.495231 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\"" Apr 23 13:44:20.495276 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.495242 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:44:20.496070 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.496048 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:44:20.496192 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.496148 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-inline-config-test-kserve-self-signed-certs\"" Apr 23 13:44:20.502796 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.502760 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"] Apr 23 13:44:20.571745 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.571718 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tls-certs\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" Apr 23 13:44:20.571883 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.571764 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-model-cache\") pod 
\"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" Apr 23 13:44:20.571883 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.571798 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-home\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" Apr 23 13:44:20.571883 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.571818 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-dshm\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" Apr 23 13:44:20.571883 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.571836 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tmp-dir\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" Apr 23 13:44:20.571883 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.571856 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: 
\"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.572054 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.571901 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4vsmb\" (UniqueName: \"kubernetes.io/projected/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kube-api-access-4vsmb\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672468 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672440 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tmp-dir\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672581 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672488 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672581 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672519 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4vsmb\" (UniqueName: \"kubernetes.io/projected/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kube-api-access-4vsmb\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672581 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672572 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tls-certs\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672742 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-model-cache\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672742 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672666 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-home\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672742 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672691 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-dshm\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672910 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672889 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kserve-provision-location\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.672979 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.672958 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-model-cache\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.673052 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.673026 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-home\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.673153 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.673064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tmp-dir\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.675072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.675050 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-dshm\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.675137 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.675082 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tls-certs\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.681558 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.681539 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4vsmb\" (UniqueName: \"kubernetes.io/projected/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kube-api-access-4vsmb\") pod \"scheduler-inline-config-test-kserve-659ddf6566-g2rbw\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.802915 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.802886 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:20.929933 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:20.929903 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"]
Apr 23 13:44:20.931276 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:44:20.931248 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podad2e6c34_d7b5_4039_a73f_99dd7a89c23b.slice/crio-42f6e1616d14a901bdac9c254afc3957d40dfe0f04e6fdb93b85c94e871136e9 WatchSource:0}: Error finding container 42f6e1616d14a901bdac9c254afc3957d40dfe0f04e6fdb93b85c94e871136e9: Status 404 returned error can't find the container with id 42f6e1616d14a901bdac9c254afc3957d40dfe0f04e6fdb93b85c94e871136e9
Apr 23 13:44:21.257437 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:21.257354 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" event={"ID":"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b","Type":"ContainerStarted","Data":"42f6e1616d14a901bdac9c254afc3957d40dfe0f04e6fdb93b85c94e871136e9"}
Apr 23 13:44:24.271266 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:24.271233 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" event={"ID":"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b","Type":"ContainerStarted","Data":"e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214"}
Apr 23 13:44:28.288559 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:28.288522 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" containerID="e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214" exitCode=0
Apr 23 13:44:28.288936 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:28.288568 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" event={"ID":"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b","Type":"ContainerDied","Data":"e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214"}
Apr 23 13:44:30.296115 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:30.296082 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" event={"ID":"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b","Type":"ContainerStarted","Data":"bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd"}
Apr 23 13:44:30.315500 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:30.315443 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" podStartSLOduration=1.275419499 podStartE2EDuration="10.315429659s" podCreationTimestamp="2026-04-23 13:44:20 +0000 UTC" firstStartedPulling="2026-04-23 13:44:20.933111896 +0000 UTC m=+781.681113839" lastFinishedPulling="2026-04-23 13:44:29.973122066 +0000 UTC m=+790.721123999" observedRunningTime="2026-04-23 13:44:30.313926927 +0000 UTC m=+791.061928882" watchObservedRunningTime="2026-04-23 13:44:30.315429659 +0000 UTC m=+791.063431613"
Apr 23 13:44:30.803278 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:30.803251 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:30.803439 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:30.803292 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:30.815545 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:30.815523 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:44:31.310374 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:44:31.310319 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"
Apr 23 13:45:01.380807 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.380777 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"]
Apr 23 13:45:01.403638 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.403610 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"]
Apr 23 13:45:01.403794 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.403719 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.406196 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.406172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"gw-sec2774c263d49959f50d9eebc552e13bf9-kserve-self-signed-certs\""
Apr 23 13:45:01.427772 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.427751 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.427898 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.427781 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c4606-ff59-4048-8c14-a19edfa523b2-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.427898 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.427803 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-5skvr\" (UniqueName: \"kubernetes.io/projected/a81c4606-ff59-4048-8c14-a19edfa523b2-kube-api-access-5skvr\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.427898 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.427824 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.428059 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.427919 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.428059 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.428023 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.428136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.428076 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.528595 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.528569 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.528734 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.528606 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.528734 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.528655 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.528734 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.528684 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c4606-ff59-4048-8c14-a19edfa523b2-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.528734 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.528710 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-5skvr\" (UniqueName: \"kubernetes.io/projected/a81c4606-ff59-4048-8c14-a19edfa523b2-kube-api-access-5skvr\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.528915 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.528779 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.528915 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.528821 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.531064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.529389 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-kserve-provision-location\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.531064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.529006 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-model-cache\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.531064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.529702 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-tmp-dir\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.531064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.529790 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-home\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.536560 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.536536 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-dshm\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.536786 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.536762 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c4606-ff59-4048-8c14-a19edfa523b2-tls-certs\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.538467 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.538443 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-5skvr\" (UniqueName: \"kubernetes.io/projected/a81c4606-ff59-4048-8c14-a19edfa523b2-kube-api-access-5skvr\") pod \"gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") " pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.713623 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.713539 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:01.837505 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:01.837486 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"]
Apr 23 13:45:01.839499 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:45:01.839473 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda81c4606_ff59_4048_8c14_a19edfa523b2.slice/crio-83f06b0f632cd53c969aec7759b89fb2cef8df3c63d60c95c5e59180afadda51 WatchSource:0}: Error finding container 83f06b0f632cd53c969aec7759b89fb2cef8df3c63d60c95c5e59180afadda51: Status 404 returned error can't find the container with id 83f06b0f632cd53c969aec7759b89fb2cef8df3c63d60c95c5e59180afadda51
Apr 23 13:45:02.400389 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:02.400351 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff" event={"ID":"a81c4606-ff59-4048-8c14-a19edfa523b2","Type":"ContainerStarted","Data":"7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed"}
Apr 23 13:45:02.400389 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:02.400392 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff" event={"ID":"a81c4606-ff59-4048-8c14-a19edfa523b2","Type":"ContainerStarted","Data":"83f06b0f632cd53c969aec7759b89fb2cef8df3c63d60c95c5e59180afadda51"}
Apr 23 13:45:05.277492 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.277455 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"]
Apr 23 13:45:05.278001 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.277725 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff" podUID="a81c4606-ff59-4048-8c14-a19edfa523b2" containerName="storage-initializer" containerID="cri-o://7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed" gracePeriod=30
Apr 23 13:45:05.835165 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.835112 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:05.866800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.866771 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-5skvr\" (UniqueName: \"kubernetes.io/projected/a81c4606-ff59-4048-8c14-a19edfa523b2-kube-api-access-5skvr\") pod \"a81c4606-ff59-4048-8c14-a19edfa523b2\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") "
Apr 23 13:45:05.866914 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.866808 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-model-cache\") pod \"a81c4606-ff59-4048-8c14-a19edfa523b2\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") "
Apr 23 13:45:05.866914 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.866880 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c4606-ff59-4048-8c14-a19edfa523b2-tls-certs\") pod \"a81c4606-ff59-4048-8c14-a19edfa523b2\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") "
Apr 23 13:45:05.866914 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.866907 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-dshm\") pod \"a81c4606-ff59-4048-8c14-a19edfa523b2\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") "
Apr 23 13:45:05.867064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.866923 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-home\") pod \"a81c4606-ff59-4048-8c14-a19edfa523b2\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") "
Apr 23 13:45:05.867064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.866946 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-tmp-dir\") pod \"a81c4606-ff59-4048-8c14-a19edfa523b2\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") "
Apr 23 13:45:05.867064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.866973 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-kserve-provision-location\") pod \"a81c4606-ff59-4048-8c14-a19edfa523b2\" (UID: \"a81c4606-ff59-4048-8c14-a19edfa523b2\") "
Apr 23 13:45:05.867208 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.867092 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-model-cache" (OuterVolumeSpecName: "model-cache") pod "a81c4606-ff59-4048-8c14-a19edfa523b2" (UID: "a81c4606-ff59-4048-8c14-a19edfa523b2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:05.867264 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.867206 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-home" (OuterVolumeSpecName: "home") pod "a81c4606-ff59-4048-8c14-a19edfa523b2" (UID: "a81c4606-ff59-4048-8c14-a19edfa523b2"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:05.867321 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.867289 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:45:05.867321 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.867305 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:45:05.867485 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.867458 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a81c4606-ff59-4048-8c14-a19edfa523b2" (UID: "a81c4606-ff59-4048-8c14-a19edfa523b2"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:05.868958 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.868928 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a81c4606-ff59-4048-8c14-a19edfa523b2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a81c4606-ff59-4048-8c14-a19edfa523b2" (UID: "a81c4606-ff59-4048-8c14-a19edfa523b2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:45:05.869040 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.868968 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a81c4606-ff59-4048-8c14-a19edfa523b2-kube-api-access-5skvr" (OuterVolumeSpecName: "kube-api-access-5skvr") pod "a81c4606-ff59-4048-8c14-a19edfa523b2" (UID: "a81c4606-ff59-4048-8c14-a19edfa523b2"). InnerVolumeSpecName "kube-api-access-5skvr". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:45:05.869184 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.869166 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-dshm" (OuterVolumeSpecName: "dshm") pod "a81c4606-ff59-4048-8c14-a19edfa523b2" (UID: "a81c4606-ff59-4048-8c14-a19edfa523b2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:05.930130 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.930087 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a81c4606-ff59-4048-8c14-a19edfa523b2" (UID: "a81c4606-ff59-4048-8c14-a19edfa523b2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:45:05.968339 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.968294 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-5skvr\" (UniqueName: \"kubernetes.io/projected/a81c4606-ff59-4048-8c14-a19edfa523b2-kube-api-access-5skvr\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:45:05.968339 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.968321 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a81c4606-ff59-4048-8c14-a19edfa523b2-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:45:05.968503 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.968350 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:45:05.968503 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.968362 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:45:05.968503 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:05.968373 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a81c4606-ff59-4048-8c14-a19edfa523b2-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:45:06.416772 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.416737 2576 generic.go:358] "Generic (PLEG): container finished" podID="a81c4606-ff59-4048-8c14-a19edfa523b2" containerID="7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed" exitCode=0
Apr 23 13:45:06.417170 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.416802 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"
Apr 23 13:45:06.417170 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.416821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff" event={"ID":"a81c4606-ff59-4048-8c14-a19edfa523b2","Type":"ContainerDied","Data":"7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed"}
Apr 23 13:45:06.417170 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.416859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff" event={"ID":"a81c4606-ff59-4048-8c14-a19edfa523b2","Type":"ContainerDied","Data":"83f06b0f632cd53c969aec7759b89fb2cef8df3c63d60c95c5e59180afadda51"}
Apr 23 13:45:06.417170 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.416874 2576 scope.go:117] "RemoveContainer" containerID="7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed"
Apr 23 13:45:06.453532 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.453506 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"]
Apr 23 13:45:06.456675 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.456651 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/gw-section-name-router-with-gat-f1d92d0f-kserve-7764d4c47dp4xff"]
Apr 23 13:45:06.486422 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.486403 2576 scope.go:117] "RemoveContainer" containerID="7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed"
Apr 23 13:45:06.486687 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:45:06.486668 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed\": container with ID starting with 7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed not found: ID does not exist" containerID="7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed"
Apr 23 13:45:06.486748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:06.486699 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed"} err="failed to get container status \"7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed\": rpc error: code = NotFound desc = could not find container \"7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed\": container with ID starting with 7e6c91ff6693d036658fd5a38ccfa6b611e56533ba2560b15f4e41cd177459ed not found: ID does not exist"
Apr 23 13:45:07.825852 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:07.825818 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a81c4606-ff59-4048-8c14-a19edfa523b2" path="/var/lib/kubelet/pods/a81c4606-ff59-4048-8c14-a19edfa523b2/volumes"
Apr 23 13:45:10.791749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.791708 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"]
Apr 23 13:45:10.792527 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.792502 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a81c4606-ff59-4048-8c14-a19edfa523b2" containerName="storage-initializer"
Apr 23 13:45:10.792527 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.792530 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a81c4606-ff59-4048-8c14-a19edfa523b2" containerName="storage-initializer"
Apr 23 13:45:10.792694 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.792680 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a81c4606-ff59-4048-8c14-a19edfa523b2" containerName="storage-initializer"
Apr 23 13:45:10.796869 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.796846 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"
Apr 23 13:45:10.798180 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.798154 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"]
Apr 23 13:45:10.799374 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.799352 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvad71fa5348b85aebd404221bba611457-kserve-self-signed-certs\""
Apr 23 13:45:10.910788 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.910763 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-4ql7w\" (UniqueName: \"kubernetes.io/projected/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kube-api-access-4ql7w\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"
Apr 23 13:45:10.910889 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.910793 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"
Apr 23 13:45:10.910889 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.910811 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID:
\"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:10.910981 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.910882 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:10.910981 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.910931 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:10.910981 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.910950 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:10.911084 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:10.911040 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" 
(UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.011580 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.011554 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.011726 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.011587 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-4ql7w\" (UniqueName: \"kubernetes.io/projected/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kube-api-access-4ql7w\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.011726 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.011604 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.011726 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.011624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.011726 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.011671 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.011928 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.011727 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.011928 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.011749 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.011928 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.011914 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.012066 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.012046 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-home\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.012172 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.012150 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.012234 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.012206 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.013872 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.013850 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-dshm\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.014009 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.013993 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.018956 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.018932 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-4ql7w\" (UniqueName: \"kubernetes.io/projected/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kube-api-access-4ql7w\") pod \"llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.107237 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.107183 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:11.237531 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.237505 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"] Apr 23 13:45:11.239612 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:45:11.239589 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd1fcc84e_46d3_471c_86fc_3b27edfebcdd.slice/crio-e30fe68ee3cc9c2ca705df02625c3390272b1ffa205530f2921f062193b5fb0c WatchSource:0}: Error finding container e30fe68ee3cc9c2ca705df02625c3390272b1ffa205530f2921f062193b5fb0c: Status 404 returned error can't find the container with id e30fe68ee3cc9c2ca705df02625c3390272b1ffa205530f2921f062193b5fb0c Apr 23 13:45:11.435444 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.435359 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" event={"ID":"d1fcc84e-46d3-471c-86fc-3b27edfebcdd","Type":"ContainerStarted","Data":"42f1677208b47f9005d686b58140db2d8cedad34591309c585b8bc0756147f0b"} Apr 23 13:45:11.435444 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:11.435396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" event={"ID":"d1fcc84e-46d3-471c-86fc-3b27edfebcdd","Type":"ContainerStarted","Data":"e30fe68ee3cc9c2ca705df02625c3390272b1ffa205530f2921f062193b5fb0c"} Apr 23 13:45:15.452595 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:15.452560 2576 generic.go:358] "Generic (PLEG): container finished" podID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" containerID="42f1677208b47f9005d686b58140db2d8cedad34591309c585b8bc0756147f0b" exitCode=0 Apr 23 13:45:15.452952 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:45:15.452616 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" event={"ID":"d1fcc84e-46d3-471c-86fc-3b27edfebcdd","Type":"ContainerDied","Data":"42f1677208b47f9005d686b58140db2d8cedad34591309c585b8bc0756147f0b"} Apr 23 13:45:16.457495 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:16.457458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" event={"ID":"d1fcc84e-46d3-471c-86fc-3b27edfebcdd","Type":"ContainerStarted","Data":"9a48d4e1ae92e38baf3dc5ea01be6f0233e227623c9bd8a1b18f3ac08b7f24bb"} Apr 23 13:45:16.479589 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:16.479537 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" podStartSLOduration=6.479522368 podStartE2EDuration="6.479522368s" podCreationTimestamp="2026-04-23 13:45:10 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:45:16.476109529 +0000 UTC m=+837.224111483" watchObservedRunningTime="2026-04-23 13:45:16.479522368 +0000 UTC m=+837.227524356" Apr 23 13:45:21.107711 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:21.107675 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:21.108077 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:21.107817 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:21.119669 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:21.119648 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:21.484081 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:21.484000 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:31.650167 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:31.650133 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"] Apr 23 13:45:31.650809 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:31.650427 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" podUID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" containerName="main" containerID="cri-o://bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd" gracePeriod=30 Apr 23 13:45:31.907986 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:31.907934 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" Apr 23 13:45:32.001272 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.001216 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-model-cache\") pod \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " Apr 23 13:45:32.001461 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.001301 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kserve-provision-location\") pod \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " Apr 23 13:45:32.001461 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.001370 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-dshm\") pod \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " Apr 23 13:45:32.001461 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.001395 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-home\") pod \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " Apr 23 13:45:32.001461 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.001451 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tmp-dir\") pod \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " Apr 23 13:45:32.001678 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:45:32.001494 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tls-certs\") pod \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " Apr 23 13:45:32.001678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.001522 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4vsmb\" (UniqueName: \"kubernetes.io/projected/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kube-api-access-4vsmb\") pod \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\" (UID: \"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b\") " Apr 23 13:45:32.001678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.001520 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-model-cache" (OuterVolumeSpecName: "model-cache") pod "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" (UID: "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:32.001822 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.001772 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:32.002169 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.002119 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" (UID: "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:32.002169 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.002130 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-home" (OuterVolumeSpecName: "home") pod "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" (UID: "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:32.004020 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.003994 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-dshm" (OuterVolumeSpecName: "dshm") pod "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" (UID: "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:32.004395 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.004370 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kube-api-access-4vsmb" (OuterVolumeSpecName: "kube-api-access-4vsmb") pod "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" (UID: "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b"). InnerVolumeSpecName "kube-api-access-4vsmb". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:45:32.004486 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.004384 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" (UID: "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:45:32.053448 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.053414 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" (UID: "ad2e6c34-d7b5-4039-a73f-99dd7a89c23b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:32.102321 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.102290 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:32.102321 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.102321 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:32.102470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.102350 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:32.102470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.102368 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:32.102470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.102384 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4vsmb\" (UniqueName: \"kubernetes.io/projected/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kube-api-access-4vsmb\") on 
node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:32.102470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.102402 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:32.508487 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.508452 2576 generic.go:358] "Generic (PLEG): container finished" podID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" containerID="bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd" exitCode=0 Apr 23 13:45:32.508645 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.508525 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" Apr 23 13:45:32.508645 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.508534 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" event={"ID":"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b","Type":"ContainerDied","Data":"bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd"} Apr 23 13:45:32.508645 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.508579 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw" event={"ID":"ad2e6c34-d7b5-4039-a73f-99dd7a89c23b","Type":"ContainerDied","Data":"42f6e1616d14a901bdac9c254afc3957d40dfe0f04e6fdb93b85c94e871136e9"} Apr 23 13:45:32.508645 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.508596 2576 scope.go:117] "RemoveContainer" containerID="bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd" Apr 23 13:45:32.517119 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.517103 2576 scope.go:117] "RemoveContainer" 
containerID="e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214" Apr 23 13:45:32.528733 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.528718 2576 scope.go:117] "RemoveContainer" containerID="bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd" Apr 23 13:45:32.528945 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:45:32.528930 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd\": container with ID starting with bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd not found: ID does not exist" containerID="bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd" Apr 23 13:45:32.528990 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.528952 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd"} err="failed to get container status \"bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd\": rpc error: code = NotFound desc = could not find container \"bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd\": container with ID starting with bacd0de566b73a6c05afdd792f345150aa09e5c99bdbeec80064edb0eeb333cd not found: ID does not exist" Apr 23 13:45:32.528990 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.528967 2576 scope.go:117] "RemoveContainer" containerID="e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214" Apr 23 13:45:32.529182 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:45:32.529168 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214\": container with ID starting with e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214 not found: ID does not exist" 
containerID="e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214" Apr 23 13:45:32.529228 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.529183 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214"} err="failed to get container status \"e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214\": rpc error: code = NotFound desc = could not find container \"e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214\": container with ID starting with e3160ac225085815d43b6bd6ae3a6773ceceb2005e3d869be32d1c8e24adb214 not found: ID does not exist" Apr 23 13:45:32.533190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.533170 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"] Apr 23 13:45:32.540301 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:32.540281 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-inline-config-test-kserve-659ddf6566-g2rbw"] Apr 23 13:45:33.825867 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:33.825833 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" path="/var/lib/kubelet/pods/ad2e6c34-d7b5-4039-a73f-99dd7a89c23b/volumes" Apr 23 13:45:42.377590 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.377551 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"] Apr 23 13:45:42.378117 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.377892 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" podUID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" containerName="main" 
containerID="cri-o://9a48d4e1ae92e38baf3dc5ea01be6f0233e227623c9bd8a1b18f3ac08b7f24bb" gracePeriod=30 Apr 23 13:45:42.545198 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.545170 2576 generic.go:358] "Generic (PLEG): container finished" podID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" containerID="9a48d4e1ae92e38baf3dc5ea01be6f0233e227623c9bd8a1b18f3ac08b7f24bb" exitCode=0 Apr 23 13:45:42.545312 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.545239 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" event={"ID":"d1fcc84e-46d3-471c-86fc-3b27edfebcdd","Type":"ContainerDied","Data":"9a48d4e1ae92e38baf3dc5ea01be6f0233e227623c9bd8a1b18f3ac08b7f24bb"} Apr 23 13:45:42.637802 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.637751 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:42.799443 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799418 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-4ql7w\" (UniqueName: \"kubernetes.io/projected/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kube-api-access-4ql7w\") pod \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " Apr 23 13:45:42.799593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799449 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-dshm\") pod \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " Apr 23 13:45:42.799593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799509 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kserve-provision-location\") pod \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " Apr 23 13:45:42.799593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799524 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-model-cache\") pod \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " Apr 23 13:45:42.799593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799546 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-home\") pod \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " Apr 23 13:45:42.799593 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799563 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tmp-dir\") pod \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " Apr 23 13:45:42.799834 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799603 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tls-certs\") pod \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\" (UID: \"d1fcc84e-46d3-471c-86fc-3b27edfebcdd\") " Apr 23 13:45:42.799886 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799832 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-home" (OuterVolumeSpecName: "home") pod "d1fcc84e-46d3-471c-86fc-3b27edfebcdd" (UID: "d1fcc84e-46d3-471c-86fc-3b27edfebcdd"). 
InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:42.799886 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.799851 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-model-cache" (OuterVolumeSpecName: "model-cache") pod "d1fcc84e-46d3-471c-86fc-3b27edfebcdd" (UID: "d1fcc84e-46d3-471c-86fc-3b27edfebcdd"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:42.800089 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.800064 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "d1fcc84e-46d3-471c-86fc-3b27edfebcdd" (UID: "d1fcc84e-46d3-471c-86fc-3b27edfebcdd"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:42.802177 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.802140 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-dshm" (OuterVolumeSpecName: "dshm") pod "d1fcc84e-46d3-471c-86fc-3b27edfebcdd" (UID: "d1fcc84e-46d3-471c-86fc-3b27edfebcdd"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:42.802306 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.802224 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d1fcc84e-46d3-471c-86fc-3b27edfebcdd" (UID: "d1fcc84e-46d3-471c-86fc-3b27edfebcdd"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:45:42.802306 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.802223 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kube-api-access-4ql7w" (OuterVolumeSpecName: "kube-api-access-4ql7w") pod "d1fcc84e-46d3-471c-86fc-3b27edfebcdd" (UID: "d1fcc84e-46d3-471c-86fc-3b27edfebcdd"). InnerVolumeSpecName "kube-api-access-4ql7w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:45:42.854370 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.854320 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d1fcc84e-46d3-471c-86fc-3b27edfebcdd" (UID: "d1fcc84e-46d3-471c-86fc-3b27edfebcdd"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:45:42.901141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.901083 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:42.901141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.901105 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:42.901141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.901115 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:42.901141 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:45:42.901124 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:42.901141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.901132 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:42.901141 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.901140 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-4ql7w\" (UniqueName: \"kubernetes.io/projected/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-kube-api-access-4ql7w\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:42.901413 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:42.901148 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d1fcc84e-46d3-471c-86fc-3b27edfebcdd-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:45:43.549716 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:43.549680 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" event={"ID":"d1fcc84e-46d3-471c-86fc-3b27edfebcdd","Type":"ContainerDied","Data":"e30fe68ee3cc9c2ca705df02625c3390272b1ffa205530f2921f062193b5fb0c"} Apr 23 13:45:43.549716 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:43.549713 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x" Apr 23 13:45:43.549716 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:43.549722 2576 scope.go:117] "RemoveContainer" containerID="9a48d4e1ae92e38baf3dc5ea01be6f0233e227623c9bd8a1b18f3ac08b7f24bb" Apr 23 13:45:43.558314 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:43.558297 2576 scope.go:117] "RemoveContainer" containerID="42f1677208b47f9005d686b58140db2d8cedad34591309c585b8bc0756147f0b" Apr 23 13:45:43.570970 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:43.570947 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"] Apr 23 13:45:43.574279 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:43.574257 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-6fa1027a-kserve-64446679b6dr86x"] Apr 23 13:45:43.825166 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:43.825091 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" path="/var/lib/kubelet/pods/d1fcc84e-46d3-471c-86fc-3b27edfebcdd/volumes" Apr 23 13:45:50.267701 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.267666 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"] Apr 23 13:45:50.268078 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268024 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" containerName="storage-initializer" Apr 23 13:45:50.268078 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268036 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" containerName="storage-initializer" Apr 23 13:45:50.268078 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268051 
2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" containerName="main" Apr 23 13:45:50.268078 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268057 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" containerName="main" Apr 23 13:45:50.268078 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268066 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" containerName="main" Apr 23 13:45:50.268078 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268074 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" containerName="main" Apr 23 13:45:50.268281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268083 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" containerName="storage-initializer" Apr 23 13:45:50.268281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268089 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" containerName="storage-initializer" Apr 23 13:45:50.268281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268141 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d1fcc84e-46d3-471c-86fc-3b27edfebcdd" containerName="main" Apr 23 13:45:50.268281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.268153 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="ad2e6c34-d7b5-4039-a73f-99dd7a89c23b" containerName="main" Apr 23 13:45:50.271389 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.271370 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.274238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.274215 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"openshift-service-ca.crt\"" Apr 23 13:45:50.274238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.274231 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"kserve-ci-e2e-test\"/\"kube-root-ca.crt\"" Apr 23 13:45:50.274432 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.274263 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvdde380eaa9fe1facad32d45131f9e34d-kserve-self-signed-certs\"" Apr 23 13:45:50.274432 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.274283 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"default-dockercfg-z29bv\"" Apr 23 13:45:50.283404 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.283385 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"] Apr 23 13:45:50.361717 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.361691 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.361717 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.361719 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.361894 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.361796 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.361894 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.361830 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-v798q\" (UniqueName: \"kubernetes.io/projected/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kube-api-access-v798q\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.361894 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.361863 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.362001 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.361922 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: 
\"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.362001 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.361955 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463163 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463132 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463172 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-v798q\" (UniqueName: \"kubernetes.io/projected/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kube-api-access-v798q\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463201 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: 
\"kubernetes.io/secret/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463229 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463250 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463274 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463297 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-model-cache\") 
pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463664 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463573 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463664 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463631 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463752 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463681 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.463752 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.463700 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-home\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: 
\"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.465565 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.465543 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-dshm\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.465789 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.465775 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.471514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.471495 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-v798q\" (UniqueName: \"kubernetes.io/projected/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kube-api-access-v798q\") pod \"llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.580614 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.580583 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:45:50.725649 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:50.725614 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"] Apr 23 13:45:50.735250 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:45:50.735226 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod41df3cb8_d2a1_4fee_ad69_c580f942ab3b.slice/crio-41c7c4ffa74b5632d5b517431b13c68fb6a2197f2a5c2816cea271043c620da4 WatchSource:0}: Error finding container 41c7c4ffa74b5632d5b517431b13c68fb6a2197f2a5c2816cea271043c620da4: Status 404 returned error can't find the container with id 41c7c4ffa74b5632d5b517431b13c68fb6a2197f2a5c2816cea271043c620da4 Apr 23 13:45:51.578904 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:51.578857 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" event={"ID":"41df3cb8-d2a1-4fee-ad69-c580f942ab3b","Type":"ContainerStarted","Data":"584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0"} Apr 23 13:45:51.579557 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:51.578904 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" event={"ID":"41df3cb8-d2a1-4fee-ad69-c580f942ab3b","Type":"ContainerStarted","Data":"41c7c4ffa74b5632d5b517431b13c68fb6a2197f2a5c2816cea271043c620da4"} Apr 23 13:45:54.591087 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:45:54.591053 2576 generic.go:358] "Generic (PLEG): container finished" podID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerID="584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0" exitCode=0 Apr 23 13:45:54.591402 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:45:54.591130 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" event={"ID":"41df3cb8-d2a1-4fee-ad69-c580f942ab3b","Type":"ContainerDied","Data":"584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0"} Apr 23 13:46:19.753457 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:19.753432 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:46:19.754216 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:19.754198 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:46:39.063825 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.063790 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"] Apr 23 13:46:39.069156 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.069135 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.071849 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.071826 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-epp-sa-dockercfg-jw9ff\"" Apr 23 13:46:39.072539 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.072522 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"scheduler-ha-replicas-test-kserve-self-signed-certs\"" Apr 23 13:46:39.079826 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.079804 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"] Apr 23 13:46:39.117129 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.117089 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.117301 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.117146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.117301 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.117215 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.117443 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.117309 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-49jnp\" (UniqueName: \"kubernetes.io/projected/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kube-api-access-49jnp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.117443 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.117388 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.117538 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.117446 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.218532 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.218495 2576 reconciler_common.go:224] 
"operationExecutor.MountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.218704 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.218543 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.218704 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.218583 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.218704 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.218631 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-49jnp\" (UniqueName: \"kubernetes.io/projected/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kube-api-access-49jnp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" Apr 23 13:46:39.218704 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.218665 2576 reconciler_common.go:224] "operationExecutor.MountVolume started 
for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.218926 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.218706 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.218982 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.218940 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-cache\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.219039 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.218975 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kserve-provision-location\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.219039 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.219019 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-tmp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.219185 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.219164 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-uds\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.222763 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.222739 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.227127 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.227102 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-49jnp\" (UniqueName: \"kubernetes.io/projected/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kube-api-access-49jnp\") pod \"scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") " pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.380749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.380661 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:46:39.879414 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:39.879383 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"]
Apr 23 13:46:39.882393 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:46:39.882363 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa2e9d4_c91d_4694_b91c_8f4a80758abc.slice/crio-782ed57f79dbdf9ebf78ba610b7ead9a9fe0712b136dbe745d4521578466faa1 WatchSource:0}: Error finding container 782ed57f79dbdf9ebf78ba610b7ead9a9fe0712b136dbe745d4521578466faa1: Status 404 returned error can't find the container with id 782ed57f79dbdf9ebf78ba610b7ead9a9fe0712b136dbe745d4521578466faa1
Apr 23 13:46:40.759499 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:40.759459 2576 generic.go:358] "Generic (PLEG): container finished" podID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerID="a8a2b089be35a21880cb9fc1dbc2cf9abeeb05bba78d6d5772bcdbc56d28f580" exitCode=0
Apr 23 13:46:40.759962 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:40.759544 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" event={"ID":"2aa2e9d4-c91d-4694-b91c-8f4a80758abc","Type":"ContainerDied","Data":"a8a2b089be35a21880cb9fc1dbc2cf9abeeb05bba78d6d5772bcdbc56d28f580"}
Apr 23 13:46:40.759962 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:40.759587 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" event={"ID":"2aa2e9d4-c91d-4694-b91c-8f4a80758abc","Type":"ContainerStarted","Data":"782ed57f79dbdf9ebf78ba610b7ead9a9fe0712b136dbe745d4521578466faa1"}
Apr 23 13:46:40.761389 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:40.761356 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" event={"ID":"41df3cb8-d2a1-4fee-ad69-c580f942ab3b","Type":"ContainerStarted","Data":"d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563"}
Apr 23 13:46:40.797241 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:40.797196 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podStartSLOduration=5.175721535 podStartE2EDuration="50.797183383s" podCreationTimestamp="2026-04-23 13:45:50 +0000 UTC" firstStartedPulling="2026-04-23 13:45:54.592259531 +0000 UTC m=+875.340261473" lastFinishedPulling="2026-04-23 13:46:40.21372137 +0000 UTC m=+920.961723321" observedRunningTime="2026-04-23 13:46:40.794908592 +0000 UTC m=+921.542910546" watchObservedRunningTime="2026-04-23 13:46:40.797183383 +0000 UTC m=+921.545185337"
Apr 23 13:46:42.771799 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:42.771748 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" event={"ID":"2aa2e9d4-c91d-4694-b91c-8f4a80758abc","Type":"ContainerStarted","Data":"bcff9d1b84f37ff29e773f7568993e4fae30283d4165999576e704b4ce8e0d71"}
Apr 23 13:46:50.580984 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:50.580928 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"
Apr 23 13:46:50.580984 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:50.580979 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"
Apr 23 13:46:50.582732 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:46:50.582683 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:47:00.581270 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:00.581228 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:47:10.581897 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:10.581843 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:47:13.061839 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:13.061798 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"]
Apr 23 13:47:13.897588 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:13.897543 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" event={"ID":"2aa2e9d4-c91d-4694-b91c-8f4a80758abc","Type":"ContainerStarted","Data":"ae9d9cb37fff821c04009559232941eb44742d5251f11609d403869248488016"}
Apr 23 13:47:13.897794 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:13.897747 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:47:13.897794 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:13.897747 2576 kubelet_pods.go:1019] "Unable to retrieve pull secret, the image pull may not succeed." pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" secret="" err="secret \"scheduler-ha-replicas-test-epp-sa-dockercfg-jw9ff\" not found"
Apr 23 13:47:13.900410 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:13.900387 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 23 13:47:13.920323 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:13.920272 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" podStartSLOduration=2.812176571 podStartE2EDuration="34.920260591s" podCreationTimestamp="2026-04-23 13:46:39 +0000 UTC" firstStartedPulling="2026-04-23 13:46:40.76092557 +0000 UTC m=+921.508927518" lastFinishedPulling="2026-04-23 13:47:12.869009589 +0000 UTC m=+953.617011538" observedRunningTime="2026-04-23 13:47:13.917581551 +0000 UTC m=+954.665583508" watchObservedRunningTime="2026-04-23 13:47:13.920260591 +0000 UTC m=+954.668262578"
Apr 23 13:47:13.971855 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:13.971817 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:13.972016 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:13.971894 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs podName:2aa2e9d4-c91d-4694-b91c-8f4a80758abc nodeName:}" failed. No retries permitted until 2026-04-23 13:47:14.471874087 +0000 UTC m=+955.219876021 (durationBeforeRetry 500ms). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:14.475322 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:14.475292 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:14.475721 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:14.475388 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs podName:2aa2e9d4-c91d-4694-b91c-8f4a80758abc nodeName:}" failed. No retries permitted until 2026-04-23 13:47:15.475368861 +0000 UTC m=+956.223370808 (durationBeforeRetry 1s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:14.901918 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:14.901876 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main" containerID="cri-o://bcff9d1b84f37ff29e773f7568993e4fae30283d4165999576e704b4ce8e0d71" gracePeriod=30
Apr 23 13:47:14.902134 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:14.901943 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="tokenizer" containerID="cri-o://ae9d9cb37fff821c04009559232941eb44742d5251f11609d403869248488016" gracePeriod=30
Apr 23 13:47:14.903702 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:14.903659 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main" probeResult="failure" output="service unhealthy (responded with \"NOT_SERVING\")"
Apr 23 13:47:15.483802 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:15.483769 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:15.484184 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:15.483853 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs podName:2aa2e9d4-c91d-4694-b91c-8f4a80758abc nodeName:}" failed. No retries permitted until 2026-04-23 13:47:17.483837823 +0000 UTC m=+958.231839756 (durationBeforeRetry 2s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:15.908155 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:15.908122 2576 generic.go:358] "Generic (PLEG): container finished" podID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerID="bcff9d1b84f37ff29e773f7568993e4fae30283d4165999576e704b4ce8e0d71" exitCode=0
Apr 23 13:47:15.908421 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:15.908169 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" event={"ID":"2aa2e9d4-c91d-4694-b91c-8f4a80758abc","Type":"ContainerDied","Data":"bcff9d1b84f37ff29e773f7568993e4fae30283d4165999576e704b4ce8e0d71"}
Apr 23 13:47:17.504488 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:17.504458 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:17.504885 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:17.504529 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs podName:2aa2e9d4-c91d-4694-b91c-8f4a80758abc nodeName:}" failed. No retries permitted until 2026-04-23 13:47:21.504515315 +0000 UTC m=+962.252517248 (durationBeforeRetry 4s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:19.381787 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:19.381753 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:47:20.581272 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:20.581221 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:47:21.543310 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:21.543277 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:21.543506 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:21.543375 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs podName:2aa2e9d4-c91d-4694-b91c-8f4a80758abc nodeName:}" failed. No retries permitted until 2026-04-23 13:47:29.54335736 +0000 UTC m=+970.291359299 (durationBeforeRetry 8s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:24.903170 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:47:24.903143 2576 logging.go:55] [core] [Channel #23 SubChannel #24]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.43:9003", ServerName: "10.134.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.43:9003: connect: connection refused"
Apr 23 13:47:25.902900 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:25.902856 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.43:9003\" within 1s: context deadline exceeded"
Apr 23 13:47:29.622852 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:29.622821 2576 secret.go:189] Couldn't get secret kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-self-signed-certs: secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:29.623236 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:29.622885 2576 nestedpendingoperations.go:348] Operation for "{volumeName:kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs podName:2aa2e9d4-c91d-4694-b91c-8f4a80758abc nodeName:}" failed. No retries permitted until 2026-04-23 13:47:45.622870387 +0000 UTC m=+986.370872320 (durationBeforeRetry 16s). Error: MountVolume.SetUp failed for volume "tls-certs" (UniqueName: "kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs") pod "scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc") : secret "scheduler-ha-replicas-test-kserve-self-signed-certs" not found
Apr 23 13:47:30.581514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:30.581478 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:47:34.903100 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:47:34.903071 2576 logging.go:55] [core] [Channel #25 SubChannel #26]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.43:9003", ServerName: "10.134.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.43:9003: connect: connection refused"
Apr 23 13:47:35.903446 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:35.903402 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.43:9003\" within 1s: context deadline exceeded"
Apr 23 13:47:40.581611 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:40.581570 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:47:44.903284 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:47:44.903253 2576 logging.go:55] [core] [Channel #27 SubChannel #28]grpc: addrConn.createTransport failed to connect to {Addr: "10.134.0.43:9003", ServerName: "10.134.0.43:9003", }. Err: connection error: desc = "transport: Error while dialing: dial tcp 10.134.0.43:9003: connect: connection refused"
Apr 23 13:47:44.976237 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:44.976208 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa2e9d4_c91d_4694_b91c_8f4a80758abc.slice/crio-conmon-ae9d9cb37fff821c04009559232941eb44742d5251f11609d403869248488016.scope\": RecentStats: unable to find data in memory cache]"
Apr 23 13:47:44.976387 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:47:44.976257 2576 cadvisor_stats_provider.go:525] "Partial failure issuing cadvisor.ContainerInfoV2" err="partial failures: [\"/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod2aa2e9d4_c91d_4694_b91c_8f4a80758abc.slice/crio-conmon-ae9d9cb37fff821c04009559232941eb44742d5251f11609d403869248488016.scope\": RecentStats: unable to find data in memory cache]"
Apr 23 13:47:45.007637 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.007615 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54_2aa2e9d4-c91d-4694-b91c-8f4a80758abc/tokenizer/0.log"
Apr 23 13:47:45.008233 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.008207 2576 generic.go:358] "Generic (PLEG): container finished" podID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerID="ae9d9cb37fff821c04009559232941eb44742d5251f11609d403869248488016" exitCode=137
Apr 23 13:47:45.008321 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.008276 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" event={"ID":"2aa2e9d4-c91d-4694-b91c-8f4a80758abc","Type":"ContainerDied","Data":"ae9d9cb37fff821c04009559232941eb44742d5251f11609d403869248488016"}
Apr 23 13:47:45.197580 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.197556 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54_2aa2e9d4-c91d-4694-b91c-8f4a80758abc/tokenizer/0.log"
Apr 23 13:47:45.198181 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.198165 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:47:45.366958 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.366928 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-uds\") pod \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") "
Apr 23 13:47:45.367144 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.366991 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-49jnp\" (UniqueName: \"kubernetes.io/projected/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kube-api-access-49jnp\") pod \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") "
Apr 23 13:47:45.367144 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367060 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-tmp\") pod \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") "
Apr 23 13:47:45.367144 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367105 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs\") pod \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") "
Apr 23 13:47:45.367144 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367140 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-cache\") pod \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") "
Apr 23 13:47:45.367379 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367173 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kserve-provision-location\") pod \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\" (UID: \"2aa2e9d4-c91d-4694-b91c-8f4a80758abc\") "
Apr 23 13:47:45.367437 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367407 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-tmp" (OuterVolumeSpecName: "tokenizer-tmp") pod "2aa2e9d4-c91d-4694-b91c-8f4a80758abc" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc"). InnerVolumeSpecName "tokenizer-tmp". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:47:45.367498 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367426 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-cache" (OuterVolumeSpecName: "tokenizer-cache") pod "2aa2e9d4-c91d-4694-b91c-8f4a80758abc" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc"). InnerVolumeSpecName "tokenizer-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:47:45.367498 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367437 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-uds" (OuterVolumeSpecName: "tokenizer-uds") pod "2aa2e9d4-c91d-4694-b91c-8f4a80758abc" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc"). InnerVolumeSpecName "tokenizer-uds". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:47:45.367584 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367493 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-cache\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:47:45.367584 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367515 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-tmp\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-tmp\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:47:45.368014 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.367994 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "2aa2e9d4-c91d-4694-b91c-8f4a80758abc" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:47:45.369540 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.369516 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "2aa2e9d4-c91d-4694-b91c-8f4a80758abc" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:47:45.369650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.369538 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kube-api-access-49jnp" (OuterVolumeSpecName: "kube-api-access-49jnp") pod "2aa2e9d4-c91d-4694-b91c-8f4a80758abc" (UID: "2aa2e9d4-c91d-4694-b91c-8f4a80758abc"). InnerVolumeSpecName "kube-api-access-49jnp". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:47:45.467929 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.467871 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:47:45.467929 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.467894 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:47:45.467929 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.467905 2576 reconciler_common.go:299] "Volume detached for volume \"tokenizer-uds\" (UniqueName: \"kubernetes.io/empty-dir/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-tokenizer-uds\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:47:45.467929 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.467913 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-49jnp\" (UniqueName: \"kubernetes.io/projected/2aa2e9d4-c91d-4694-b91c-8f4a80758abc-kube-api-access-49jnp\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:47:45.903413 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:45.903377 2576 prober.go:120] "Probe failed" probeType="Readiness" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main" probeResult="failure" output="timeout: failed to connect service \"10.134.0.43:9003\" within 1s: context deadline exceeded"
Apr 23 13:47:46.013173 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:46.013144 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54_2aa2e9d4-c91d-4694-b91c-8f4a80758abc/tokenizer/0.log"
Apr 23 13:47:46.013853 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:46.013831 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54" event={"ID":"2aa2e9d4-c91d-4694-b91c-8f4a80758abc","Type":"ContainerDied","Data":"782ed57f79dbdf9ebf78ba610b7ead9a9fe0712b136dbe745d4521578466faa1"}
Apr 23 13:47:46.013971 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:46.013871 2576 scope.go:117] "RemoveContainer" containerID="ae9d9cb37fff821c04009559232941eb44742d5251f11609d403869248488016"
Apr 23 13:47:46.013971 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:46.013843 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"
Apr 23 13:47:46.023704 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:46.023688 2576 scope.go:117] "RemoveContainer" containerID="bcff9d1b84f37ff29e773f7568993e4fae30283d4165999576e704b4ce8e0d71"
Apr 23 13:47:46.031196 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:46.031122 2576 scope.go:117] "RemoveContainer" containerID="a8a2b089be35a21880cb9fc1dbc2cf9abeeb05bba78d6d5772bcdbc56d28f580"
Apr 23 13:47:46.033056 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:46.033034 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"]
Apr 23 13:47:46.039297 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:46.039278 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/scheduler-ha-replicas-test-kserve-router-scheduler-79c7868t8b54"]
Apr 23 13:47:47.828381 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:47.828319 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" path="/var/lib/kubelet/pods/2aa2e9d4-c91d-4694-b91c-8f4a80758abc/volumes"
Apr 23 13:47:50.581293 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:47:50.581245 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:48:00.581930 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:00.581887 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:48:10.581454 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:10.581409 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" probeResult="failure" output="Get \"https://10.134.0.42:8000/health\": dial tcp 10.134.0.42:8000: connect: connection refused"
Apr 23 13:48:20.591386 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:20.591354 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"
Apr 23 13:48:20.598795 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:20.598765 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"
Apr 23 13:48:23.070735 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.070700 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn"]
Apr 23 13:48:23.071114 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.071064 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main"
Apr 23 13:48:23.071114 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.071076 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main"
Apr 23 13:48:23.071114 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.071088 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="storage-initializer"
Apr 23 13:48:23.071114 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.071093 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="storage-initializer"
Apr 23 13:48:23.071114 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.071113 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="tokenizer"
Apr 23 13:48:23.071292 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.071119 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="tokenizer"
Apr 23 13:48:23.071292 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.071175 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="tokenizer"
Apr 23 13:48:23.071292 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.071185 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="2aa2e9d4-c91d-4694-b91c-8f4a80758abc" containerName="main"
Apr 23 13:48:23.075315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.075295 2576 util.go:30] "No sandbox for pod can be found.
Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.078388 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.078368 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"conv-test-lora-crit-kserve-self-signed-certs\"" Apr 23 13:48:23.088206 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.088182 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn"] Apr 23 13:48:23.184900 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.184873 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-home\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.185071 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.184913 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-tmp-dir\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.185071 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.184933 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.185071 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:48:23.185050 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-model-cache\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.185242 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.185113 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-rlgv6\" (UniqueName: \"kubernetes.io/projected/72829302-9455-4fb2-b433-1f167afaebb7-kube-api-access-rlgv6\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.185242 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.185146 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-dshm\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.185242 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.185174 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72829302-9455-4fb2-b433-1f167afaebb7-tls-certs\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.285743 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.285698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-model-cache\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.285743 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.285759 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-rlgv6\" (UniqueName: \"kubernetes.io/projected/72829302-9455-4fb2-b433-1f167afaebb7-kube-api-access-rlgv6\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.286019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.285783 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-dshm\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.286019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.285800 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72829302-9455-4fb2-b433-1f167afaebb7-tls-certs\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.286019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.285833 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-home\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" 
Apr 23 13:48:23.286019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.285871 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-tmp-dir\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.286019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.285897 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.286263 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.286167 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-model-cache\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.286300 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.286266 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-home\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.286370 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.286311 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: 
\"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-tmp-dir\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.286419 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.286411 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-kserve-provision-location\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.287998 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.287977 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-dshm\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.288292 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.288273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72829302-9455-4fb2-b433-1f167afaebb7-tls-certs\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.304068 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.304044 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-rlgv6\" (UniqueName: \"kubernetes.io/projected/72829302-9455-4fb2-b433-1f167afaebb7-kube-api-access-rlgv6\") pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") " 
pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.385162 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.385086 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" Apr 23 13:48:23.521159 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.521136 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn"] Apr 23 13:48:23.523266 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:48:23.523236 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod72829302_9455_4fb2_b433_1f167afaebb7.slice/crio-1ca591160447997a62fdce4fe97e8d4e8fb1c735364f875b043172ba1a4c03ef WatchSource:0}: Error finding container 1ca591160447997a62fdce4fe97e8d4e8fb1c735364f875b043172ba1a4c03ef: Status 404 returned error can't find the container with id 1ca591160447997a62fdce4fe97e8d4e8fb1c735364f875b043172ba1a4c03ef Apr 23 13:48:23.525186 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:23.525160 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 13:48:24.150312 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:24.150289 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/0.log" Apr 23 13:48:24.150651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:24.150324 2576 generic.go:358] "Generic (PLEG): container finished" podID="72829302-9455-4fb2-b433-1f167afaebb7" containerID="16ae7454d345721516238488ca5e97a79d60a866ec53d0e0f9c5a23d73a7dec3" exitCode=1 Apr 23 13:48:24.150651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:24.150375 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" 
event={"ID":"72829302-9455-4fb2-b433-1f167afaebb7","Type":"ContainerDied","Data":"16ae7454d345721516238488ca5e97a79d60a866ec53d0e0f9c5a23d73a7dec3"} Apr 23 13:48:24.150651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:24.150396 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" event={"ID":"72829302-9455-4fb2-b433-1f167afaebb7","Type":"ContainerStarted","Data":"1ca591160447997a62fdce4fe97e8d4e8fb1c735364f875b043172ba1a4c03ef"} Apr 23 13:48:25.157067 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:25.157043 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/1.log" Apr 23 13:48:25.157470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:25.157420 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/0.log" Apr 23 13:48:25.157470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:25.157455 2576 generic.go:358] "Generic (PLEG): container finished" podID="72829302-9455-4fb2-b433-1f167afaebb7" containerID="a54c12916c117492cda2877614598da4b4afee12286f7981f7a48dae4bd758aa" exitCode=1 Apr 23 13:48:25.157589 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:25.157537 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" event={"ID":"72829302-9455-4fb2-b433-1f167afaebb7","Type":"ContainerDied","Data":"a54c12916c117492cda2877614598da4b4afee12286f7981f7a48dae4bd758aa"} Apr 23 13:48:25.157589 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:25.157574 2576 scope.go:117] "RemoveContainer" containerID="16ae7454d345721516238488ca5e97a79d60a866ec53d0e0f9c5a23d73a7dec3" Apr 23 13:48:25.157896 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:25.157879 2576 scope.go:117] "RemoveContainer" 
containerID="16ae7454d345721516238488ca5e97a79d60a866ec53d0e0f9c5a23d73a7dec3" Apr 23 13:48:25.168084 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:48:25.168058 2576 log.go:32] "RemoveContainer from runtime service failed" err="rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-746cf5776b-dcwfn_kserve-ci-e2e-test_72829302-9455-4fb2-b433-1f167afaebb7_0 in pod sandbox 1ca591160447997a62fdce4fe97e8d4e8fb1c735364f875b043172ba1a4c03ef from index: no such id: '16ae7454d345721516238488ca5e97a79d60a866ec53d0e0f9c5a23d73a7dec3'" containerID="16ae7454d345721516238488ca5e97a79d60a866ec53d0e0f9c5a23d73a7dec3" Apr 23 13:48:25.168150 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:48:25.168101 2576 kuberuntime_container.go:951] "Unhandled Error" err="failed to remove pod init container \"storage-initializer\": rpc error: code = Unknown desc = failed to delete container k8s_storage-initializer_conv-test-lora-crit-kserve-746cf5776b-dcwfn_kserve-ci-e2e-test_72829302-9455-4fb2-b433-1f167afaebb7_0 in pod sandbox 1ca591160447997a62fdce4fe97e8d4e8fb1c735364f875b043172ba1a4c03ef from index: no such id: '16ae7454d345721516238488ca5e97a79d60a866ec53d0e0f9c5a23d73a7dec3'; Skipping pod \"conv-test-lora-crit-kserve-746cf5776b-dcwfn_kserve-ci-e2e-test(72829302-9455-4fb2-b433-1f167afaebb7)\"" logger="UnhandledError" Apr 23 13:48:25.169636 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:48:25.169614 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-746cf5776b-dcwfn_kserve-ci-e2e-test(72829302-9455-4fb2-b433-1f167afaebb7)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" podUID="72829302-9455-4fb2-b433-1f167afaebb7" Apr 23 13:48:26.162388 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:26.162364 2576 log.go:25] "Finished parsing log 
file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/1.log" Apr 23 13:48:26.162945 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:48:26.162925 2576 pod_workers.go:1301] "Error syncing pod, skipping" err="failed to \"StartContainer\" for \"storage-initializer\" with CrashLoopBackOff: \"back-off 10s restarting failed container=storage-initializer pod=conv-test-lora-crit-kserve-746cf5776b-dcwfn_kserve-ci-e2e-test(72829302-9455-4fb2-b433-1f167afaebb7)\"" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" podUID="72829302-9455-4fb2-b433-1f167afaebb7" Apr 23 13:48:37.784220 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.784182 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"] Apr 23 13:48:37.789591 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.789568 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:37.792104 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.792083 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 23 13:48:37.801280 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.801255 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"] Apr 23 13:48:37.915308 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.915271 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-home\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:37.915308 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.915308 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kserve-provision-location\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:37.915577 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.915410 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-dshm\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:37.915577 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.915441 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tmp-dir\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:37.915577 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.915476 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-gbjbc\" (UniqueName: \"kubernetes.io/projected/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kube-api-access-gbjbc\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:37.915577 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.915535 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started 
for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-model-cache\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:37.915577 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:37.915564 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tls-certs\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.017003 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.016970 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-model-cache\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.017628 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.017607 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tls-certs\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.017721 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.017405 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-model-cache\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " 
pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.017781 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.017736 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-home\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.017781 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.017769 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kserve-provision-location\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.017878 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.017826 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-dshm\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.017878 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.017851 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tmp-dir\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.017981 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.017889 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-gbjbc\" 
(UniqueName: \"kubernetes.io/projected/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kube-api-access-gbjbc\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.018240 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.018214 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-home\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.018371 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.018211 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tmp-dir\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.018371 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.018323 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kserve-provision-location\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.020061 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.020033 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-dshm\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:48:38.020387 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.020368 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tls-certs\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"
Apr 23 13:48:38.032163 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.032142 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-gbjbc\" (UniqueName: \"kubernetes.io/projected/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kube-api-access-gbjbc\") pod \"stop-feature-test-kserve-76fc9bdd58-xgckc\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"
Apr 23 13:48:38.102170 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.102091 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"
Apr 23 13:48:38.226453 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:38.226423 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"]
Apr 23 13:48:38.229427 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:48:38.229403 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod175b6e6d_865c_455b_b6f1_c0e3f6efe504.slice/crio-5197c04eb488a8a1c5d8276bf31f148ac7a61da7c4e0299a7838b136e3065835 WatchSource:0}: Error finding container 5197c04eb488a8a1c5d8276bf31f148ac7a61da7c4e0299a7838b136e3065835: Status 404 returned error can't find the container with id 5197c04eb488a8a1c5d8276bf31f148ac7a61da7c4e0299a7838b136e3065835
Apr 23 13:48:39.207142 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:39.207099 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" event={"ID":"175b6e6d-865c-455b-b6f1-c0e3f6efe504","Type":"ContainerStarted","Data":"f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257"}
Apr 23 13:48:39.207528 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:39.207152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" event={"ID":"175b6e6d-865c-455b-b6f1-c0e3f6efe504","Type":"ContainerStarted","Data":"5197c04eb488a8a1c5d8276bf31f148ac7a61da7c4e0299a7838b136e3065835"}
Apr 23 13:48:40.217248 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:40.217165 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/1.log"
Apr 23 13:48:40.217672 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:40.217296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" event={"ID":"72829302-9455-4fb2-b433-1f167afaebb7","Type":"ContainerStarted","Data":"f831312da015c91a09f1820e31355882b1f4ca3e0e1bf0b3f089ce28ef0ab773"}
Apr 23 13:48:40.579057 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:40.578994 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn"]
Apr 23 13:48:41.222122 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.222093 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/2.log"
Apr 23 13:48:41.222619 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.222483 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/1.log"
Apr 23 13:48:41.222619 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.222515 2576 generic.go:358] "Generic (PLEG): container finished" podID="72829302-9455-4fb2-b433-1f167afaebb7" containerID="f831312da015c91a09f1820e31355882b1f4ca3e0e1bf0b3f089ce28ef0ab773" exitCode=1
Apr 23 13:48:41.222619 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.222561 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" event={"ID":"72829302-9455-4fb2-b433-1f167afaebb7","Type":"ContainerDied","Data":"f831312da015c91a09f1820e31355882b1f4ca3e0e1bf0b3f089ce28ef0ab773"}
Apr 23 13:48:41.222619 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.222592 2576 scope.go:117] "RemoveContainer" containerID="a54c12916c117492cda2877614598da4b4afee12286f7981f7a48dae4bd758aa"
Apr 23 13:48:41.368227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.368207 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/2.log"
Apr 23 13:48:41.368425 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.368266 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn"
Apr 23 13:48:41.550842 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.550752 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-model-cache\") pod \"72829302-9455-4fb2-b433-1f167afaebb7\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") "
Apr 23 13:48:41.551041 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.550893 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72829302-9455-4fb2-b433-1f167afaebb7-tls-certs\") pod \"72829302-9455-4fb2-b433-1f167afaebb7\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") "
Apr 23 13:48:41.551041 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.550976 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-kserve-provision-location\") pod \"72829302-9455-4fb2-b433-1f167afaebb7\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") "
Apr 23 13:48:41.551041 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.550978 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-model-cache" (OuterVolumeSpecName: "model-cache") pod "72829302-9455-4fb2-b433-1f167afaebb7" (UID: "72829302-9455-4fb2-b433-1f167afaebb7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:41.551222 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551203 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-home\") pod \"72829302-9455-4fb2-b433-1f167afaebb7\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") "
Apr 23 13:48:41.551283 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551238 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-tmp-dir\") pod \"72829302-9455-4fb2-b433-1f167afaebb7\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") "
Apr 23 13:48:41.551283 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551276 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-dshm\") pod \"72829302-9455-4fb2-b433-1f167afaebb7\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") "
Apr 23 13:48:41.551423 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551319 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-rlgv6\" (UniqueName: \"kubernetes.io/projected/72829302-9455-4fb2-b433-1f167afaebb7-kube-api-access-rlgv6\") pod \"72829302-9455-4fb2-b433-1f167afaebb7\" (UID: \"72829302-9455-4fb2-b433-1f167afaebb7\") "
Apr 23 13:48:41.551530 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551321 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "72829302-9455-4fb2-b433-1f167afaebb7" (UID: "72829302-9455-4fb2-b433-1f167afaebb7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:41.551530 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551510 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "72829302-9455-4fb2-b433-1f167afaebb7" (UID: "72829302-9455-4fb2-b433-1f167afaebb7"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:41.551666 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551637 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-home" (OuterVolumeSpecName: "home") pod "72829302-9455-4fb2-b433-1f167afaebb7" (UID: "72829302-9455-4fb2-b433-1f167afaebb7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:41.551884 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551862 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:48:41.551999 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551890 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:48:41.551999 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551950 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:48:41.551999 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.551967 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:48:41.553641 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.553600 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-dshm" (OuterVolumeSpecName: "dshm") pod "72829302-9455-4fb2-b433-1f167afaebb7" (UID: "72829302-9455-4fb2-b433-1f167afaebb7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:48:41.553732 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.553638 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/72829302-9455-4fb2-b433-1f167afaebb7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "72829302-9455-4fb2-b433-1f167afaebb7" (UID: "72829302-9455-4fb2-b433-1f167afaebb7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:48:41.554039 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.554019 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/72829302-9455-4fb2-b433-1f167afaebb7-kube-api-access-rlgv6" (OuterVolumeSpecName: "kube-api-access-rlgv6") pod "72829302-9455-4fb2-b433-1f167afaebb7" (UID: "72829302-9455-4fb2-b433-1f167afaebb7"). InnerVolumeSpecName "kube-api-access-rlgv6". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:48:41.653315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.653280 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/72829302-9455-4fb2-b433-1f167afaebb7-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:48:41.653315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.653309 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-rlgv6\" (UniqueName: \"kubernetes.io/projected/72829302-9455-4fb2-b433-1f167afaebb7-kube-api-access-rlgv6\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:48:41.653315 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:41.653320 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/72829302-9455-4fb2-b433-1f167afaebb7-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:48:42.227474 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:42.227441 2576 generic.go:358] "Generic (PLEG): container finished" podID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerID="f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257" exitCode=0
Apr 23 13:48:42.227843 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:42.227509 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" event={"ID":"175b6e6d-865c-455b-b6f1-c0e3f6efe504","Type":"ContainerDied","Data":"f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257"}
Apr 23 13:48:42.228784 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:42.228729 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_conv-test-lora-crit-kserve-746cf5776b-dcwfn_72829302-9455-4fb2-b433-1f167afaebb7/storage-initializer/2.log"
Apr 23 13:48:42.228872 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:42.228840 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn" event={"ID":"72829302-9455-4fb2-b433-1f167afaebb7","Type":"ContainerDied","Data":"1ca591160447997a62fdce4fe97e8d4e8fb1c735364f875b043172ba1a4c03ef"}
Apr 23 13:48:42.228872 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:42.228843 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn"
Apr 23 13:48:42.228968 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:42.228873 2576 scope.go:117] "RemoveContainer" containerID="f831312da015c91a09f1820e31355882b1f4ca3e0e1bf0b3f089ce28ef0ab773"
Apr 23 13:48:42.274666 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:42.274637 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn"]
Apr 23 13:48:42.278198 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:42.278167 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/conv-test-lora-crit-kserve-746cf5776b-dcwfn"]
Apr 23 13:48:43.234725 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:43.234693 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" event={"ID":"175b6e6d-865c-455b-b6f1-c0e3f6efe504","Type":"ContainerStarted","Data":"56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882"}
Apr 23 13:48:43.255081 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:43.255037 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podStartSLOduration=6.255024133 podStartE2EDuration="6.255024133s" podCreationTimestamp="2026-04-23 13:48:37 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:48:43.25379032 +0000 UTC m=+1044.001792293" watchObservedRunningTime="2026-04-23 13:48:43.255024133 +0000 UTC m=+1044.003026087"
Apr 23 13:48:43.826207 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:43.826174 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="72829302-9455-4fb2-b433-1f167afaebb7" path="/var/lib/kubelet/pods/72829302-9455-4fb2-b433-1f167afaebb7/volumes"
Apr 23 13:48:47.311678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:47.311642 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"]
Apr 23 13:48:47.312140 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:47.312008 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" containerID="cri-o://d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563" gracePeriod=30
Apr 23 13:48:48.102885 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:48.102844 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"
Apr 23 13:48:48.103064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:48.102901 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"
Apr 23 13:48:48.104416 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:48.104382 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 23 13:48:58.102688 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:58.102651 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 23 13:48:59.586406 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.586362 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"]
Apr 23 13:48:59.586921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.586902 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.587016 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.586926 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.587016 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.586940 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.587016 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.586949 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.587016 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.586960 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.587016 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.586969 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.587182 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.587087 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.587182 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.587102 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.587182 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.587113 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="72829302-9455-4fb2-b433-1f167afaebb7" containerName="storage-initializer"
Apr 23 13:48:59.592476 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.592454 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.595195 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.595172 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-test-kserve-self-signed-certs\""
Apr 23 13:48:59.608355 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.607189 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"]
Apr 23 13:48:59.715098 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.715064 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-home\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.715098 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.715099 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-model-cache\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.715286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.715116 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-dshm\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.715286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.715150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.715286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.715226 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-tmp-dir\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.715286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.715257 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb19795-2224-4f0c-a4d1-71c49b639c48-tls-certs\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.715286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.715281 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-sd5pc\" (UniqueName: \"kubernetes.io/projected/9cb19795-2224-4f0c-a4d1-71c49b639c48-kube-api-access-sd5pc\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.816571 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.816528 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.816746 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.816624 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-tmp-dir\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.816746 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.816675 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb19795-2224-4f0c-a4d1-71c49b639c48-tls-certs\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.816746 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.816714 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-sd5pc\" (UniqueName: \"kubernetes.io/projected/9cb19795-2224-4f0c-a4d1-71c49b639c48-kube-api-access-sd5pc\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.816915 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.816778 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-home\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.816915 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.816805 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-model-cache\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.816915 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.816835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-dshm\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.817072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.816960 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-kserve-provision-location\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.817072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.817000 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-tmp-dir\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.817236 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.817212 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-home\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.817236 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.817232 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-model-cache\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.819095 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.819070 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-dshm\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.819748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.819718 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb19795-2224-4f0c-a4d1-71c49b639c48-tls-certs\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.832628 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.832607 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-sd5pc\" (UniqueName: \"kubernetes.io/projected/9cb19795-2224-4f0c-a4d1-71c49b639c48-kube-api-access-sd5pc\") pod \"custom-route-timeout-test-kserve-74d9947b58-qwnmm\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:48:59.913409 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:48:59.913309 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:49:00.284948 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:00.284919 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"]
Apr 23 13:49:00.285434 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:49:00.285412 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod9cb19795_2224_4f0c_a4d1_71c49b639c48.slice/crio-63bc733519610357d7706e4336f8a58acce7ded8cb16eecac9240003d905e1ff WatchSource:0}: Error finding container 63bc733519610357d7706e4336f8a58acce7ded8cb16eecac9240003d905e1ff: Status 404 returned error can't find the container with id 63bc733519610357d7706e4336f8a58acce7ded8cb16eecac9240003d905e1ff
Apr 23 13:49:00.299549 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:00.299519 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" event={"ID":"9cb19795-2224-4f0c-a4d1-71c49b639c48","Type":"ContainerStarted","Data":"63bc733519610357d7706e4336f8a58acce7ded8cb16eecac9240003d905e1ff"}
Apr 23 13:49:01.305257 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:01.305219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" event={"ID":"9cb19795-2224-4f0c-a4d1-71c49b639c48","Type":"ContainerStarted","Data":"d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91"}
Apr 23 13:49:04.318536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:04.318508 2576 generic.go:358] "Generic (PLEG): container finished" podID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerID="d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91" exitCode=0
Apr 23 13:49:04.318917 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:04.318558 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" event={"ID":"9cb19795-2224-4f0c-a4d1-71c49b639c48","Type":"ContainerDied","Data":"d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91"}
Apr 23 13:49:05.328869 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:05.328783 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" event={"ID":"9cb19795-2224-4f0c-a4d1-71c49b639c48","Type":"ContainerStarted","Data":"952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2"}
Apr 23 13:49:05.353447 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:05.353391 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podStartSLOduration=6.353375724 podStartE2EDuration="6.353375724s" podCreationTimestamp="2026-04-23 13:48:59 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:49:05.349997359 +0000 UTC m=+1066.097999314" watchObservedRunningTime="2026-04-23 13:49:05.353375724 +0000 UTC m=+1066.101377684"
Apr 23 13:49:08.102808 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:08.102760 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused"
Apr 23 13:49:09.914268 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:09.914227 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:49:09.914672 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:09.914306 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"
Apr 23 13:49:09.915445 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:09.915421 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused"
Apr 23 13:49:17.588114 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.588092 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb_41df3cb8-d2a1-4fee-ad69-c580f942ab3b/main/0.log"
Apr 23 13:49:17.588505 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.588488 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"
Apr 23 13:49:17.683586 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.683558 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tmp-dir\") pod \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") "
Apr 23 13:49:17.683775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.683609 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-home\") pod \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") "
Apr 23 13:49:17.683775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.683626 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kserve-provision-location\") pod \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") "
Apr 23 13:49:17.683775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.683645 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-dshm\") pod \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") "
Apr 23 13:49:17.683775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.683683 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-model-cache\") pod \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") "
Apr 23 13:49:17.683775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.683711 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tls-certs\") pod \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") "
Apr 23 13:49:17.684043 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.683780 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-v798q\" (UniqueName: \"kubernetes.io/projected/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kube-api-access-v798q\") pod \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\" (UID: \"41df3cb8-d2a1-4fee-ad69-c580f942ab3b\") "
Apr 23 13:49:17.684238 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.684214 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-home" (OuterVolumeSpecName: "home") pod "41df3cb8-d2a1-4fee-ad69-c580f942ab3b" (UID: "41df3cb8-d2a1-4fee-ad69-c580f942ab3b"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:49:17.684696 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.684644 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-model-cache" (OuterVolumeSpecName: "model-cache") pod "41df3cb8-d2a1-4fee-ad69-c580f942ab3b" (UID: "41df3cb8-d2a1-4fee-ad69-c580f942ab3b"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:49:17.686485 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.686423 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kube-api-access-v798q" (OuterVolumeSpecName: "kube-api-access-v798q") pod "41df3cb8-d2a1-4fee-ad69-c580f942ab3b" (UID: "41df3cb8-d2a1-4fee-ad69-c580f942ab3b"). InnerVolumeSpecName "kube-api-access-v798q". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:49:17.687708 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.687681 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-dshm" (OuterVolumeSpecName: "dshm") pod "41df3cb8-d2a1-4fee-ad69-c580f942ab3b" (UID: "41df3cb8-d2a1-4fee-ad69-c580f942ab3b"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:49:17.687708 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.687695 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "41df3cb8-d2a1-4fee-ad69-c580f942ab3b" (UID: "41df3cb8-d2a1-4fee-ad69-c580f942ab3b"). InnerVolumeSpecName "tls-certs".
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:49:17.701838 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.701810 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "41df3cb8-d2a1-4fee-ad69-c580f942ab3b" (UID: "41df3cb8-d2a1-4fee-ad69-c580f942ab3b"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:49:17.735194 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.735144 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "41df3cb8-d2a1-4fee-ad69-c580f942ab3b" (UID: "41df3cb8-d2a1-4fee-ad69-c580f942ab3b"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:49:17.785503 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.785471 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:49:17.785503 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.785504 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:49:17.785702 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.785517 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:49:17.785702 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.785529 2576 
reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:49:17.785702 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.785543 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:49:17.785702 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.785555 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:49:17.785702 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:17.785566 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-v798q\" (UniqueName: \"kubernetes.io/projected/41df3cb8-d2a1-4fee-ad69-c580f942ab3b-kube-api-access-v798q\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:49:18.103473 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.103376 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:49:18.384514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.384429 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb_41df3cb8-d2a1-4fee-ad69-c580f942ab3b/main/0.log" Apr 23 13:49:18.384784 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.384760 2576 generic.go:358] "Generic (PLEG): container finished" podID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" 
containerID="d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563" exitCode=137 Apr 23 13:49:18.384888 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.384845 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" event={"ID":"41df3cb8-d2a1-4fee-ad69-c580f942ab3b","Type":"ContainerDied","Data":"d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563"} Apr 23 13:49:18.384888 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.384886 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" event={"ID":"41df3cb8-d2a1-4fee-ad69-c580f942ab3b","Type":"ContainerDied","Data":"41c7c4ffa74b5632d5b517431b13c68fb6a2197f2a5c2816cea271043c620da4"} Apr 23 13:49:18.385004 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.384902 2576 scope.go:117] "RemoveContainer" containerID="d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563" Apr 23 13:49:18.385004 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.384857 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb" Apr 23 13:49:18.393992 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.393975 2576 scope.go:117] "RemoveContainer" containerID="584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0" Apr 23 13:49:18.408923 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.408892 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"] Apr 23 13:49:18.414622 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.414599 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-55f7ae4a-kserve-8458695698kk6jb"] Apr 23 13:49:18.452450 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.452426 2576 scope.go:117] "RemoveContainer" containerID="d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563" Apr 23 13:49:18.452750 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:49:18.452731 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563\": container with ID starting with d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563 not found: ID does not exist" containerID="d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563" Apr 23 13:49:18.452815 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.452758 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563"} err="failed to get container status \"d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563\": rpc error: code = NotFound desc = could not find container \"d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563\": container with ID starting with 
d702cf33b8ec07aa7ecee857cd19e528e8758e28786f58a25479582c449d4563 not found: ID does not exist" Apr 23 13:49:18.452815 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.452776 2576 scope.go:117] "RemoveContainer" containerID="584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0" Apr 23 13:49:18.453059 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:49:18.453031 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0\": container with ID starting with 584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0 not found: ID does not exist" containerID="584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0" Apr 23 13:49:18.453113 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:18.453069 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0"} err="failed to get container status \"584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0\": rpc error: code = NotFound desc = could not find container \"584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0\": container with ID starting with 584c9cb1ddf1d5f8eae727e7cb0d79a6bb67cc0b702638fb0a57fd9b3e689da0 not found: ID does not exist" Apr 23 13:49:19.827285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:19.827253 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" path="/var/lib/kubelet/pods/41df3cb8-d2a1-4fee-ad69-c580f942ab3b/volumes" Apr 23 13:49:19.914690 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:19.914619 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:49:28.102940 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:28.102884 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:49:29.914609 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:29.914564 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:49:38.102731 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:38.102673 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:49:39.914243 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:39.914192 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:49:48.102600 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:48.102555 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" 
output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:49:49.914218 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:49.914180 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:49:58.103031 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:58.102989 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:49:59.914388 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:49:59.914350 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:50:08.102995 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:08.102951 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:50:09.914037 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:09.913992 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:50:18.103009 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:18.102963 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:50:19.914168 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:19.914122 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:50:28.103221 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:28.103177 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:50:29.914574 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:29.914519 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:50:38.103163 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:38.103071 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.45:8000/health\": dial tcp 10.134.0.45:8000: connect: connection refused" Apr 23 13:50:39.914493 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:39.914448 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" probeResult="failure" output="Get \"https://10.134.0.46:8000/health\": dial tcp 10.134.0.46:8000: connect: connection refused" Apr 23 13:50:48.112517 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:48.112478 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:50:48.120032 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:48.120004 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:50:49.561622 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:49.561592 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"] Apr 23 13:50:49.701848 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:49.701788 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" containerID="cri-o://56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882" gracePeriod=30 Apr 23 13:50:49.925377 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:49.925285 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" Apr 23 13:50:49.933086 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:50:49.933059 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" 
pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" Apr 23 13:51:00.097923 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:00.097886 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"] Apr 23 13:51:00.098426 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:00.098245 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" containerID="cri-o://952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2" gracePeriod=30 Apr 23 13:51:08.178376 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.178300 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h"] Apr 23 13:51:08.178857 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.178831 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="storage-initializer" Apr 23 13:51:08.178857 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.178857 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="storage-initializer" Apr 23 13:51:08.179007 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.178870 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" Apr 23 13:51:08.179007 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.178877 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" Apr 23 13:51:08.179007 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.178961 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="41df3cb8-d2a1-4fee-ad69-c580f942ab3b" containerName="main" Apr 23 
13:51:08.183731 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.183709 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.186498 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.186466 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"router-with-refs-test-kserve-self-signed-certs\"" Apr 23 13:51:08.195822 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.195797 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h"] Apr 23 13:51:08.279931 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.279906 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-tmp-dir\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.280094 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.279946 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-kserve-provision-location\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.280094 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.279977 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-t2sg9\" (UniqueName: \"kubernetes.io/projected/dd8ed085-bf83-463a-87ad-0516925e8af3-kube-api-access-t2sg9\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: 
\"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.280094 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.280058 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-home\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.280094 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.280088 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-model-cache\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.280231 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.280115 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8ed085-bf83-463a-87ad-0516925e8af3-tls-certs\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.280231 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.280132 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-dshm\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.380742 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:51:08.380708 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8ed085-bf83-463a-87ad-0516925e8af3-tls-certs\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.380742 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.380742 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-dshm\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.380955 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.380788 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-tmp-dir\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.380955 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.380813 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-kserve-provision-location\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.380955 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.380834 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-t2sg9\" (UniqueName: 
\"kubernetes.io/projected/dd8ed085-bf83-463a-87ad-0516925e8af3-kube-api-access-t2sg9\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.380955 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.380874 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-home\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.380955 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.380928 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-model-cache\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.381239 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.381219 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-home\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.381305 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.381273 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-tmp-dir\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 
13:51:08.381399 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.381325 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-kserve-provision-location\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.381399 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.381385 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-model-cache\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.383046 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.383025 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-dshm\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.383320 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.383302 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8ed085-bf83-463a-87ad-0516925e8af3-tls-certs\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.389651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.389629 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-t2sg9\" (UniqueName: 
\"kubernetes.io/projected/dd8ed085-bf83-463a-87ad-0516925e8af3-kube-api-access-t2sg9\") pod \"router-with-refs-test-kserve-58684bc86b-57s8h\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") " pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.497026 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.496944 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:08.623521 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.623494 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h"] Apr 23 13:51:08.625444 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:51:08.625416 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poddd8ed085_bf83_463a_87ad_0516925e8af3.slice/crio-2915b58e8b5a00f61b2a6ae2dee4c5e06caaf6e82fe7e1d35ebefb4e94bddf5c WatchSource:0}: Error finding container 2915b58e8b5a00f61b2a6ae2dee4c5e06caaf6e82fe7e1d35ebefb4e94bddf5c: Status 404 returned error can't find the container with id 2915b58e8b5a00f61b2a6ae2dee4c5e06caaf6e82fe7e1d35ebefb4e94bddf5c Apr 23 13:51:08.769519 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.769442 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" event={"ID":"dd8ed085-bf83-463a-87ad-0516925e8af3","Type":"ContainerStarted","Data":"202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588"} Apr 23 13:51:08.769519 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:08.769478 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" event={"ID":"dd8ed085-bf83-463a-87ad-0516925e8af3","Type":"ContainerStarted","Data":"2915b58e8b5a00f61b2a6ae2dee4c5e06caaf6e82fe7e1d35ebefb4e94bddf5c"} Apr 23 
13:51:12.786786 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:12.786750 2576 generic.go:358] "Generic (PLEG): container finished" podID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerID="202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588" exitCode=0 Apr 23 13:51:12.787111 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:12.786821 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" event={"ID":"dd8ed085-bf83-463a-87ad-0516925e8af3","Type":"ContainerDied","Data":"202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588"} Apr 23 13:51:13.793747 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:13.793715 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" event={"ID":"dd8ed085-bf83-463a-87ad-0516925e8af3","Type":"ContainerStarted","Data":"d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e"} Apr 23 13:51:13.813845 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:13.813796 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podStartSLOduration=5.813780127 podStartE2EDuration="5.813780127s" podCreationTimestamp="2026-04-23 13:51:08 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:51:13.812191169 +0000 UTC m=+1194.560193123" watchObservedRunningTime="2026-04-23 13:51:13.813780127 +0000 UTC m=+1194.561782084" Apr 23 13:51:18.498033 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:18.497995 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:18.498033 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:18.498037 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:51:18.499577 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:18.499548 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:51:19.835470 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:19.835444 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:51:19.835809 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:19.835445 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:51:19.980639 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:19.980612 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76fc9bdd58-xgckc_175b6e6d-865c-455b-b6f1-c0e3f6efe504/main/0.log" Apr 23 13:51:19.980985 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:19.980968 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:51:20.084481 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.084449 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kserve-provision-location\") pod \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " Apr 23 13:51:20.084655 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.084493 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-gbjbc\" (UniqueName: \"kubernetes.io/projected/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kube-api-access-gbjbc\") pod \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " Apr 23 13:51:20.084655 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.084528 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tmp-dir\") pod \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " Apr 23 13:51:20.084655 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.084584 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-dshm\") pod \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " Apr 23 13:51:20.084655 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.084646 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-home\") pod \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " Apr 23 13:51:20.084859 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:51:20.084673 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tls-certs\") pod \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " Apr 23 13:51:20.084859 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.084701 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-model-cache\") pod \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\" (UID: \"175b6e6d-865c-455b-b6f1-c0e3f6efe504\") " Apr 23 13:51:20.085202 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.085174 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-model-cache" (OuterVolumeSpecName: "model-cache") pod "175b6e6d-865c-455b-b6f1-c0e3f6efe504" (UID: "175b6e6d-865c-455b-b6f1-c0e3f6efe504"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:20.085647 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.085613 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-home" (OuterVolumeSpecName: "home") pod "175b6e6d-865c-455b-b6f1-c0e3f6efe504" (UID: "175b6e6d-865c-455b-b6f1-c0e3f6efe504"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:20.087188 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.087161 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-dshm" (OuterVolumeSpecName: "dshm") pod "175b6e6d-865c-455b-b6f1-c0e3f6efe504" (UID: "175b6e6d-865c-455b-b6f1-c0e3f6efe504"). InnerVolumeSpecName "dshm". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:20.087288 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.087255 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kube-api-access-gbjbc" (OuterVolumeSpecName: "kube-api-access-gbjbc") pod "175b6e6d-865c-455b-b6f1-c0e3f6efe504" (UID: "175b6e6d-865c-455b-b6f1-c0e3f6efe504"). InnerVolumeSpecName "kube-api-access-gbjbc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:51:20.087416 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.087397 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "175b6e6d-865c-455b-b6f1-c0e3f6efe504" (UID: "175b6e6d-865c-455b-b6f1-c0e3f6efe504"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:51:20.097179 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.097153 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "175b6e6d-865c-455b-b6f1-c0e3f6efe504" (UID: "175b6e6d-865c-455b-b6f1-c0e3f6efe504"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:20.143231 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.143193 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "175b6e6d-865c-455b-b6f1-c0e3f6efe504" (UID: "175b6e6d-865c-455b-b6f1-c0e3f6efe504"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:20.185559 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.185532 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:20.185559 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.185557 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:20.185732 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.185566 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:20.185732 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.185577 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:20.185732 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.185585 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:20.185732 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.185594 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-gbjbc\" (UniqueName: \"kubernetes.io/projected/175b6e6d-865c-455b-b6f1-c0e3f6efe504-kube-api-access-gbjbc\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:20.185732 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.185605 2576 
reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/175b6e6d-865c-455b-b6f1-c0e3f6efe504-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:20.819840 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.819811 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76fc9bdd58-xgckc_175b6e6d-865c-455b-b6f1-c0e3f6efe504/main/0.log" Apr 23 13:51:20.820153 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.820128 2576 generic.go:358] "Generic (PLEG): container finished" podID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerID="56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882" exitCode=137 Apr 23 13:51:20.820232 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.820198 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" event={"ID":"175b6e6d-865c-455b-b6f1-c0e3f6efe504","Type":"ContainerDied","Data":"56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882"} Apr 23 13:51:20.820277 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.820261 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" event={"ID":"175b6e6d-865c-455b-b6f1-c0e3f6efe504","Type":"ContainerDied","Data":"5197c04eb488a8a1c5d8276bf31f148ac7a61da7c4e0299a7838b136e3065835"} Apr 23 13:51:20.820344 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.820201 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc" Apr 23 13:51:20.820344 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.820307 2576 scope.go:117] "RemoveContainer" containerID="56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882" Apr 23 13:51:20.831044 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.831015 2576 scope.go:117] "RemoveContainer" containerID="f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257" Apr 23 13:51:20.850522 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.850496 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"] Apr 23 13:51:20.852041 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.852020 2576 scope.go:117] "RemoveContainer" containerID="56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882" Apr 23 13:51:20.852398 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:51:20.852373 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882\": container with ID starting with 56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882 not found: ID does not exist" containerID="56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882" Apr 23 13:51:20.852658 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.852434 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882"} err="failed to get container status \"56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882\": rpc error: code = NotFound desc = could not find container \"56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882\": container with ID starting with 56d3301b89640ab71b1d03a918a9ff15297ec03e0319dff3add9794483719882 not found: ID does not exist" Apr 23 
13:51:20.852658 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.852455 2576 scope.go:117] "RemoveContainer" containerID="f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257" Apr 23 13:51:20.852790 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:51:20.852763 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257\": container with ID starting with f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257 not found: ID does not exist" containerID="f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257" Apr 23 13:51:20.852848 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.852795 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257"} err="failed to get container status \"f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257\": rpc error: code = NotFound desc = could not find container \"f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257\": container with ID starting with f69c88693fe6a20d243ed780b7005edcd35c8628844661c5f4ac28a502377257 not found: ID does not exist" Apr 23 13:51:20.855423 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:20.855401 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-xgckc"] Apr 23 13:51:21.828996 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:21.828961 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" path="/var/lib/kubelet/pods/175b6e6d-865c-455b-b6f1-c0e3f6efe504/volumes" Apr 23 13:51:23.277892 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.277861 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2"] Apr 23 13:51:23.278356 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.278278 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="storage-initializer" Apr 23 13:51:23.278356 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.278291 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="storage-initializer" Apr 23 13:51:23.278356 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.278310 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" Apr 23 13:51:23.278356 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.278316 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" Apr 23 13:51:23.278597 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.278395 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="175b6e6d-865c-455b-b6f1-c0e3f6efe504" containerName="main" Apr 23 13:51:23.315709 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.315683 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2"] Apr 23 13:51:23.315859 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.315791 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.318441 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.318416 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"stop-feature-test-kserve-self-signed-certs\"" Apr 23 13:51:23.411390 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.411356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r6bl\" (UniqueName: \"kubernetes.io/projected/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kube-api-access-7r6bl\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.411523 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.411420 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tls-certs\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.411523 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.411455 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-home\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.411523 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.411473 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-dshm\") pod 
\"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.411641 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.411532 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-model-cache\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.411641 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.411551 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tmp-dir\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.411641 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.411582 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kserve-provision-location\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512439 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512413 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-home\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512587 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512445 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-dshm\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512473 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-model-cache\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512498 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tmp-dir\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512525 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kserve-provision-location\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512587 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r6bl\" (UniqueName: 
\"kubernetes.io/projected/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kube-api-access-7r6bl\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512829 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512643 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tls-certs\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512884 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512838 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-home\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512936 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512884 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-model-cache\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.512999 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.512971 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tmp-dir\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.513079 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:51:23.513064 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kserve-provision-location\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.514697 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.514671 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-dshm\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.515064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.515043 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tls-certs\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.521135 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.521112 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r6bl\" (UniqueName: \"kubernetes.io/projected/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kube-api-access-7r6bl\") pod \"stop-feature-test-kserve-76fc9bdd58-z5wk2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.625460 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.625422 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:23.960518 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:23.960493 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2"] Apr 23 13:51:23.962668 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:51:23.962642 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-podd30a9fe8_5aa8_4d7e_805c_dd8780c2d7f2.slice/crio-baeec66a3d818f37f6977bb3eeed6aaf6ba76b77fc98af5b008cd762e75097ee WatchSource:0}: Error finding container baeec66a3d818f37f6977bb3eeed6aaf6ba76b77fc98af5b008cd762e75097ee: Status 404 returned error can't find the container with id baeec66a3d818f37f6977bb3eeed6aaf6ba76b77fc98af5b008cd762e75097ee Apr 23 13:51:24.839893 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:24.839859 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" event={"ID":"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2","Type":"ContainerStarted","Data":"d689ce8c62322cda5cf0d0a262f105dd9494f0ae1508de3daeb1e02c62bcbaec"} Apr 23 13:51:24.839893 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:24.839900 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" event={"ID":"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2","Type":"ContainerStarted","Data":"baeec66a3d818f37f6977bb3eeed6aaf6ba76b77fc98af5b008cd762e75097ee"} Apr 23 13:51:28.498263 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:28.498223 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:51:28.855583 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:28.855547 2576 generic.go:358] "Generic (PLEG): container finished" podID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerID="d689ce8c62322cda5cf0d0a262f105dd9494f0ae1508de3daeb1e02c62bcbaec" exitCode=0 Apr 23 13:51:28.855742 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:28.855586 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" event={"ID":"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2","Type":"ContainerDied","Data":"d689ce8c62322cda5cf0d0a262f105dd9494f0ae1508de3daeb1e02c62bcbaec"} Apr 23 13:51:29.863533 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:29.863496 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" event={"ID":"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2","Type":"ContainerStarted","Data":"07ca57736feab7730b4e98be298bbac145029e6f015b1981c31af7db308a2519"} Apr 23 13:51:29.887801 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:29.887746 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podStartSLOduration=6.887728444 podStartE2EDuration="6.887728444s" podCreationTimestamp="2026-04-23 13:51:23 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:51:29.884231533 +0000 UTC m=+1210.632233489" watchObservedRunningTime="2026-04-23 13:51:29.887728444 +0000 UTC m=+1210.635730400" Apr 23 13:51:30.389598 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.389575 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-74d9947b58-qwnmm_9cb19795-2224-4f0c-a4d1-71c49b639c48/main/0.log" Apr 23 13:51:30.390047 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.390024 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" Apr 23 13:51:30.480765 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.480688 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-kserve-provision-location\") pod \"9cb19795-2224-4f0c-a4d1-71c49b639c48\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " Apr 23 13:51:30.480765 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.480740 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-dshm\") pod \"9cb19795-2224-4f0c-a4d1-71c49b639c48\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " Apr 23 13:51:30.480765 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.480759 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb19795-2224-4f0c-a4d1-71c49b639c48-tls-certs\") pod \"9cb19795-2224-4f0c-a4d1-71c49b639c48\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " Apr 23 13:51:30.481046 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.480787 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-model-cache\") pod \"9cb19795-2224-4f0c-a4d1-71c49b639c48\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " Apr 23 13:51:30.481046 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.480842 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-home\") pod \"9cb19795-2224-4f0c-a4d1-71c49b639c48\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " Apr 23 13:51:30.481153 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:51:30.481055 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-model-cache" (OuterVolumeSpecName: "model-cache") pod "9cb19795-2224-4f0c-a4d1-71c49b639c48" (UID: "9cb19795-2224-4f0c-a4d1-71c49b639c48"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:30.481223 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.481187 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-tmp-dir\") pod \"9cb19795-2224-4f0c-a4d1-71c49b639c48\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " Apr 23 13:51:30.481286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.481271 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-sd5pc\" (UniqueName: \"kubernetes.io/projected/9cb19795-2224-4f0c-a4d1-71c49b639c48-kube-api-access-sd5pc\") pod \"9cb19795-2224-4f0c-a4d1-71c49b639c48\" (UID: \"9cb19795-2224-4f0c-a4d1-71c49b639c48\") " Apr 23 13:51:30.481604 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.481578 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-home" (OuterVolumeSpecName: "home") pod "9cb19795-2224-4f0c-a4d1-71c49b639c48" (UID: "9cb19795-2224-4f0c-a4d1-71c49b639c48"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:30.481697 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.481627 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:30.483083 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.483030 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-dshm" (OuterVolumeSpecName: "dshm") pod "9cb19795-2224-4f0c-a4d1-71c49b639c48" (UID: "9cb19795-2224-4f0c-a4d1-71c49b639c48"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:30.483657 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.483585 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/9cb19795-2224-4f0c-a4d1-71c49b639c48-kube-api-access-sd5pc" (OuterVolumeSpecName: "kube-api-access-sd5pc") pod "9cb19795-2224-4f0c-a4d1-71c49b639c48" (UID: "9cb19795-2224-4f0c-a4d1-71c49b639c48"). InnerVolumeSpecName "kube-api-access-sd5pc". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:51:30.483756 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.483659 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/9cb19795-2224-4f0c-a4d1-71c49b639c48-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "9cb19795-2224-4f0c-a4d1-71c49b639c48" (UID: "9cb19795-2224-4f0c-a4d1-71c49b639c48"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:51:30.499608 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.499579 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "9cb19795-2224-4f0c-a4d1-71c49b639c48" (UID: "9cb19795-2224-4f0c-a4d1-71c49b639c48"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:30.517456 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.517415 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "9cb19795-2224-4f0c-a4d1-71c49b639c48" (UID: "9cb19795-2224-4f0c-a4d1-71c49b639c48"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:51:30.582243 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.582212 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-sd5pc\" (UniqueName: \"kubernetes.io/projected/9cb19795-2224-4f0c-a4d1-71c49b639c48-kube-api-access-sd5pc\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:30.582243 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.582239 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:30.582243 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.582249 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:30.582458 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:51:30.582258 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/9cb19795-2224-4f0c-a4d1-71c49b639c48-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:30.582458 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.582268 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:30.582458 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.582275 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/9cb19795-2224-4f0c-a4d1-71c49b639c48-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:51:30.868966 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.868941 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-test-kserve-74d9947b58-qwnmm_9cb19795-2224-4f0c-a4d1-71c49b639c48/main/0.log" Apr 23 13:51:30.869393 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.869345 2576 generic.go:358] "Generic (PLEG): container finished" podID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerID="952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2" exitCode=137 Apr 23 13:51:30.869448 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.869418 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" event={"ID":"9cb19795-2224-4f0c-a4d1-71c49b639c48","Type":"ContainerDied","Data":"952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2"} Apr 23 13:51:30.869448 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.869432 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" Apr 23 13:51:30.869527 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.869456 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm" event={"ID":"9cb19795-2224-4f0c-a4d1-71c49b639c48","Type":"ContainerDied","Data":"63bc733519610357d7706e4336f8a58acce7ded8cb16eecac9240003d905e1ff"} Apr 23 13:51:30.869527 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.869472 2576 scope.go:117] "RemoveContainer" containerID="952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2" Apr 23 13:51:30.877967 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.877945 2576 scope.go:117] "RemoveContainer" containerID="d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91" Apr 23 13:51:30.893074 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.893055 2576 scope.go:117] "RemoveContainer" containerID="952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2" Apr 23 13:51:30.893414 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:51:30.893389 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2\": container with ID starting with 952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2 not found: ID does not exist" containerID="952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2" Apr 23 13:51:30.893497 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.893426 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2"} err="failed to get container status \"952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2\": rpc error: code = NotFound desc = could not find container 
\"952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2\": container with ID starting with 952c1b95dfd177be5e96a8db0b060373e5ee96c8aebbd74a74195f8a1a40b7d2 not found: ID does not exist" Apr 23 13:51:30.893497 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.893450 2576 scope.go:117] "RemoveContainer" containerID="d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91" Apr 23 13:51:30.893782 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:51:30.893758 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91\": container with ID starting with d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91 not found: ID does not exist" containerID="d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91" Apr 23 13:51:30.893858 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.893790 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91"} err="failed to get container status \"d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91\": rpc error: code = NotFound desc = could not find container \"d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91\": container with ID starting with d09187d796da7364f2561da78ab9e99c388a0a5958e8b79dcab90c885ed58c91 not found: ID does not exist" Apr 23 13:51:30.896746 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.896722 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"] Apr 23 13:51:30.900965 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:30.900937 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-test-kserve-74d9947b58-qwnmm"] Apr 23 13:51:31.825749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:31.825719 
2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" path="/var/lib/kubelet/pods/9cb19795-2224-4f0c-a4d1-71c49b639c48/volumes" Apr 23 13:51:33.626168 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:33.626132 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:33.626168 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:33.626174 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:51:33.627675 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:33.627643 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:51:38.498483 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:38.498443 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:51:43.626387 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:43.626320 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:51:48.497384 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:48.497319 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:51:53.626230 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:53.626183 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:51:58.497797 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:51:58.497748 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:52:03.626811 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:03.626757 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:52:08.497759 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:08.497668 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:52:13.626367 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:13.626293 2576 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:52:18.497872 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:18.497820 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:52:23.626525 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:23.626486 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:52:28.498285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:28.498240 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:52:33.626575 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:33.626526 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:52:38.498227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:38.498169 2576 prober.go:120] "Probe failed" 
probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:52:43.626596 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:43.626552 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:52:48.497392 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:48.497318 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:52:53.626050 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:53.626009 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:52:58.497673 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:52:58.497634 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" probeResult="failure" output="Get \"https://10.134.0.47:8000/health\": dial tcp 10.134.0.47:8000: connect: connection refused" Apr 23 13:53:03.626421 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:03.626375 2576 prober.go:120] "Probe 
failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:53:08.507723 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:08.507689 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:53:08.515191 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:08.515165 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" Apr 23 13:53:13.626640 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:13.626588 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" probeResult="failure" output="Get \"https://10.134.0.48:8000/health\": dial tcp 10.134.0.48:8000: connect: connection refused" Apr 23 13:53:17.519770 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:17.519733 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h"] Apr 23 13:53:17.520661 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:17.520621 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" containerID="cri-o://d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e" gracePeriod=30 Apr 23 13:53:23.636058 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:23.636024 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:53:23.643389 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:23.643370 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:53:33.588226 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.588196 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"] Apr 23 13:53:33.588626 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.588611 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" Apr 23 13:53:33.588676 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.588627 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" Apr 23 13:53:33.588676 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.588642 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="storage-initializer" Apr 23 13:53:33.588676 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.588648 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="storage-initializer" Apr 23 13:53:33.588814 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.588702 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="9cb19795-2224-4f0c-a4d1-71c49b639c48" containerName="main" Apr 23 13:53:33.594083 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.594065 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:53:33.596753 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.596733 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-dockercfg-jt77c\"" Apr 23 13:53:33.596873 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.596781 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8f1a6f044e8c7a4d31a250e0c4861caf-kserve-self-signed-certs\"" Apr 23 13:53:33.602254 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.602233 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"] Apr 23 13:53:33.611560 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.611528 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"] Apr 23 13:53:33.615163 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.615146 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:53:33.627510 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.627489 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"] Apr 23 13:53:33.715125 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715092 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:53:33.715300 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715134 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:53:33.715300 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715150 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:53:33.715300 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715238 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume 
\"tls-certs\" (UniqueName: \"kubernetes.io/secret/588a69c5-5a12-4037-b03f-725fdbee0106-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:53:33.715300 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715276 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:53:33.715536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715314 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:53:33.715536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715361 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-kqlsz\" (UniqueName: \"kubernetes.io/projected/588a69c5-5a12-4037-b03f-725fdbee0106-kube-api-access-kqlsz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:53:33.715536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715380 2576 reconciler_common.go:251] 
"operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:53:33.715536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715397 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k8kdh\" (UniqueName: \"kubernetes.io/projected/682e1451-a1f8-41f8-a224-9a8f7ff69531-kube-api-access-k8kdh\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:53:33.715536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715418 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:53:33.715709 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715530 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:53:33.715709 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715558 2576 
reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/682e1451-a1f8-41f8-a224-9a8f7ff69531-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.715709 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715595 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.715709 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.715626 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816083 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816059 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816217 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816098 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816217 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816123 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.816217 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816139 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816402 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816230 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/588a69c5-5a12-4037-b03f-725fdbee0106-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.816402 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816276 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.816512 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816404 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.816512 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816453 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-kqlsz\" (UniqueName: \"kubernetes.io/projected/588a69c5-5a12-4037-b03f-725fdbee0106-kube-api-access-kqlsz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.816512 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816467 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816512 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816476 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.816512 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816483 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816760 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816534 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k8kdh\" (UniqueName: \"kubernetes.io/projected/682e1451-a1f8-41f8-a224-9a8f7ff69531-kube-api-access-k8kdh\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816760 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816546 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816760 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.816760 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816578 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.816760 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816614 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816760 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816642 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/682e1451-a1f8-41f8-a224-9a8f7ff69531-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.816760 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816698 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.817099 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816811 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.817099 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.816973 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.817099 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.817067 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-home\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.817285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.817262 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.818492 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.818472 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.818625 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.818603 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-dshm\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.818823 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.818798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/588a69c5-5a12-4037-b03f-725fdbee0106-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.819114 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.819095 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/682e1451-a1f8-41f8-a224-9a8f7ff69531-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.824617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.824593 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-kqlsz\" (UniqueName: \"kubernetes.io/projected/588a69c5-5a12-4037-b03f-725fdbee0106-kube-api-access-kqlsz\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.824816 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.824798 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k8kdh\" (UniqueName: \"kubernetes.io/projected/682e1451-a1f8-41f8-a224-9a8f7ff69531-kube-api-access-k8kdh\") pod \"llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:33.903656 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.903600 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:33.927435 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:33.927413 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:34.045817 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:34.045792 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"]
Apr 23 13:53:34.048122 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:53:34.048096 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod588a69c5_5a12_4037_b03f_725fdbee0106.slice/crio-fa308f4f04efb19cf47008aed9dbb682f9e04c72a7068b9f49c564c418f35988 WatchSource:0}: Error finding container fa308f4f04efb19cf47008aed9dbb682f9e04c72a7068b9f49c564c418f35988: Status 404 returned error can't find the container with id fa308f4f04efb19cf47008aed9dbb682f9e04c72a7068b9f49c564c418f35988
Apr 23 13:53:34.050638 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:34.050400 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider
Apr 23 13:53:34.066906 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:34.066887 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"]
Apr 23 13:53:34.068301 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:53:34.068281 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod682e1451_a1f8_41f8_a224_9a8f7ff69531.slice/crio-1a1f19808b22030a73b667975194e15a246c09e8ff4bc464b4ed2a15214a78be WatchSource:0}: Error finding container 1a1f19808b22030a73b667975194e15a246c09e8ff4bc464b4ed2a15214a78be: Status 404 returned error can't find the container with id 1a1f19808b22030a73b667975194e15a246c09e8ff4bc464b4ed2a15214a78be
Apr 23 13:53:34.322455 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:34.322420 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" event={"ID":"682e1451-a1f8-41f8-a224-9a8f7ff69531","Type":"ContainerStarted","Data":"1767b8b2e0adf7d30763e27d65fe2b72f2c394929379dc5aec0191188837166c"}
Apr 23 13:53:34.322624 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:34.322459 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" event={"ID":"682e1451-a1f8-41f8-a224-9a8f7ff69531","Type":"ContainerStarted","Data":"1a1f19808b22030a73b667975194e15a246c09e8ff4bc464b4ed2a15214a78be"}
Apr 23 13:53:34.323503 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:34.323458 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" event={"ID":"588a69c5-5a12-4037-b03f-725fdbee0106","Type":"ContainerStarted","Data":"fa308f4f04efb19cf47008aed9dbb682f9e04c72a7068b9f49c564c418f35988"}
Apr 23 13:53:35.329921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:35.329784 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" event={"ID":"588a69c5-5a12-4037-b03f-725fdbee0106","Type":"ContainerStarted","Data":"f9b6148bc90a75213b25438f7439356fedefb96c99216eb30adfa521c8b2dd83"}
Apr 23 13:53:36.335949 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:36.335907 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" event={"ID":"588a69c5-5a12-4037-b03f-725fdbee0106","Type":"ContainerStarted","Data":"b9c3ba5e5705473b001c1214f565be80bdd7e7b2a7ba252aa2424be6432e1505"}
Apr 23 13:53:36.336678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:36.336318 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:39.348339 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:39.348306 2576 generic.go:358] "Generic (PLEG): container finished" podID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerID="1767b8b2e0adf7d30763e27d65fe2b72f2c394929379dc5aec0191188837166c" exitCode=0
Apr 23 13:53:39.348792 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:39.348368 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" event={"ID":"682e1451-a1f8-41f8-a224-9a8f7ff69531","Type":"ContainerDied","Data":"1767b8b2e0adf7d30763e27d65fe2b72f2c394929379dc5aec0191188837166c"}
Apr 23 13:53:40.354357 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:40.354311 2576 generic.go:358] "Generic (PLEG): container finished" podID="588a69c5-5a12-4037-b03f-725fdbee0106" containerID="b9c3ba5e5705473b001c1214f565be80bdd7e7b2a7ba252aa2424be6432e1505" exitCode=0
Apr 23 13:53:40.354842 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:40.354386 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" event={"ID":"588a69c5-5a12-4037-b03f-725fdbee0106","Type":"ContainerDied","Data":"b9c3ba5e5705473b001c1214f565be80bdd7e7b2a7ba252aa2424be6432e1505"}
Apr 23 13:53:40.356264 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:40.356244 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" event={"ID":"682e1451-a1f8-41f8-a224-9a8f7ff69531","Type":"ContainerStarted","Data":"04037263a4247664328367f1c5b6b39a6bc995a3fa5258be45e5832e0405c04c"}
Apr 23 13:53:40.397522 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:40.397482 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podStartSLOduration=7.397469366 podStartE2EDuration="7.397469366s" podCreationTimestamp="2026-04-23 13:53:33 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:53:40.395421334 +0000 UTC m=+1341.143423290" watchObservedRunningTime="2026-04-23 13:53:40.397469366 +0000 UTC m=+1341.145471320"
Apr 23 13:53:41.361364 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:41.361309 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" event={"ID":"588a69c5-5a12-4037-b03f-725fdbee0106","Type":"ContainerStarted","Data":"85d864dcb93fa4f906d7f1c90b676fc709183811798d43dbde86cf33a9d2a0ea"}
Apr 23 13:53:41.386830 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:41.386773 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podStartSLOduration=7.449871282 podStartE2EDuration="8.386756032s" podCreationTimestamp="2026-04-23 13:53:33 +0000 UTC" firstStartedPulling="2026-04-23 13:53:34.05061742 +0000 UTC m=+1334.798619353" lastFinishedPulling="2026-04-23 13:53:34.98750217 +0000 UTC m=+1335.735504103" observedRunningTime="2026-04-23 13:53:41.384532876 +0000 UTC m=+1342.132534831" watchObservedRunningTime="2026-04-23 13:53:41.386756032 +0000 UTC m=+1342.134757990"
Apr 23 13:53:43.904147 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:43.904094 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:43.904147 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:43.904156 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:53:43.905426 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:43.905396 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused"
Apr 23 13:53:43.928610 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:43.928584 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:43.928763 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:43.928624 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:53:43.930068 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:43.930037 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused"
Apr 23 13:53:44.250781 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:44.250688 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2"]
Apr 23 13:53:44.251116 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:44.251087 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" containerID="cri-o://07ca57736feab7730b4e98be298bbac145029e6f015b1981c31af7db308a2519" gracePeriod=30
Apr 23 13:53:47.785921 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.785899 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-58684bc86b-57s8h_dd8ed085-bf83-463a-87ad-0516925e8af3/main/0.log"
Apr 23 13:53:47.786348 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.786315 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h"
Apr 23 13:53:47.856014 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.855983 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-tmp-dir\") pod \"dd8ed085-bf83-463a-87ad-0516925e8af3\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") "
Apr 23 13:53:47.856198 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.856031 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-model-cache\") pod \"dd8ed085-bf83-463a-87ad-0516925e8af3\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") "
Apr 23 13:53:47.856198 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.856072 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-t2sg9\" (UniqueName: \"kubernetes.io/projected/dd8ed085-bf83-463a-87ad-0516925e8af3-kube-api-access-t2sg9\") pod \"dd8ed085-bf83-463a-87ad-0516925e8af3\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") "
Apr 23 13:53:47.856198 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.856123 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8ed085-bf83-463a-87ad-0516925e8af3-tls-certs\") pod \"dd8ed085-bf83-463a-87ad-0516925e8af3\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") "
Apr 23 13:53:47.856198 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.856181 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-dshm\") pod \"dd8ed085-bf83-463a-87ad-0516925e8af3\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") "
Apr 23 13:53:47.856434 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.856243 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-home\") pod \"dd8ed085-bf83-463a-87ad-0516925e8af3\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") "
Apr 23 13:53:47.856434 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.856292 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-kserve-provision-location\") pod \"dd8ed085-bf83-463a-87ad-0516925e8af3\" (UID: \"dd8ed085-bf83-463a-87ad-0516925e8af3\") "
Apr 23 13:53:47.856434 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.856343 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-model-cache" (OuterVolumeSpecName: "model-cache") pod "dd8ed085-bf83-463a-87ad-0516925e8af3" (UID: "dd8ed085-bf83-463a-87ad-0516925e8af3"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:53:47.856651 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.856627 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:53:47.857136 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.857109 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-home" (OuterVolumeSpecName: "home") pod "dd8ed085-bf83-463a-87ad-0516925e8af3" (UID: "dd8ed085-bf83-463a-87ad-0516925e8af3"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:53:47.858598 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.858570 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/dd8ed085-bf83-463a-87ad-0516925e8af3-kube-api-access-t2sg9" (OuterVolumeSpecName: "kube-api-access-t2sg9") pod "dd8ed085-bf83-463a-87ad-0516925e8af3" (UID: "dd8ed085-bf83-463a-87ad-0516925e8af3"). InnerVolumeSpecName "kube-api-access-t2sg9". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:53:47.859627 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.859600 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/dd8ed085-bf83-463a-87ad-0516925e8af3-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "dd8ed085-bf83-463a-87ad-0516925e8af3" (UID: "dd8ed085-bf83-463a-87ad-0516925e8af3"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:53:47.859760 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.859739 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-dshm" (OuterVolumeSpecName: "dshm") pod "dd8ed085-bf83-463a-87ad-0516925e8af3" (UID: "dd8ed085-bf83-463a-87ad-0516925e8af3"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:53:47.874165 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.874130 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "dd8ed085-bf83-463a-87ad-0516925e8af3" (UID: "dd8ed085-bf83-463a-87ad-0516925e8af3"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:53:47.914621 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.914581 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "dd8ed085-bf83-463a-87ad-0516925e8af3" (UID: "dd8ed085-bf83-463a-87ad-0516925e8af3"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:53:47.957667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.957631 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:53:47.957667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.957659 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:53:47.957667 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.957670 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:53:47.957947 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.957679 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-t2sg9\" (UniqueName: \"kubernetes.io/projected/dd8ed085-bf83-463a-87ad-0516925e8af3-kube-api-access-t2sg9\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:53:47.957947 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.957688 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/dd8ed085-bf83-463a-87ad-0516925e8af3-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:53:47.957947 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:47.957696 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/dd8ed085-bf83-463a-87ad-0516925e8af3-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:53:48.386737 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.386704 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_router-with-refs-test-kserve-58684bc86b-57s8h_dd8ed085-bf83-463a-87ad-0516925e8af3/main/0.log"
Apr 23 13:53:48.387098 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.387067 2576 generic.go:358] "Generic (PLEG): container finished" podID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerID="d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e" exitCode=137
Apr 23 13:53:48.387232 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.387152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" event={"ID":"dd8ed085-bf83-463a-87ad-0516925e8af3","Type":"ContainerDied","Data":"d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e"}
Apr 23 13:53:48.387232 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.387190 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h" event={"ID":"dd8ed085-bf83-463a-87ad-0516925e8af3","Type":"ContainerDied","Data":"2915b58e8b5a00f61b2a6ae2dee4c5e06caaf6e82fe7e1d35ebefb4e94bddf5c"}
Apr 23 13:53:48.387232 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.387205 2576 scope.go:117] "RemoveContainer" containerID="d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e"
Apr 23 13:53:48.387418 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.387162 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h"
Apr 23 13:53:48.396189 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.396167 2576 scope.go:117] "RemoveContainer" containerID="202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588"
Apr 23 13:53:48.410144 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.410119 2576 scope.go:117] "RemoveContainer" containerID="d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e"
Apr 23 13:53:48.410426 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:53:48.410404 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e\": container with ID starting with d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e not found: ID does not exist" containerID="d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e"
Apr 23 13:53:48.410514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.410431 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e"} err="failed to get container status \"d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e\": rpc error: code = NotFound desc = could not find container \"d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e\": container with ID starting with d8673087a7d262a0329224d947b348444e93c3e4c16f603324e90c1f6438c81e not found: ID does not exist"
Apr 23 13:53:48.410514 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.410448 2576 scope.go:117] "RemoveContainer" containerID="202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588"
Apr 23 13:53:48.410676 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:53:48.410660 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588\": container with ID starting with 202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588 not found: ID does not exist" containerID="202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588"
Apr 23 13:53:48.410734 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.410679 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588"} err="failed to get container status \"202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588\": rpc error: code = NotFound desc = could not find container \"202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588\": container with ID starting with 202ac4cc0c216310f4f915fbe25231d849f50be2e46f53c044bcf5a435521588 not found: ID does not exist"
Apr 23 13:53:48.419363 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.419322 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h"]
Apr 23 13:53:48.422352 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:48.422311 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/router-with-refs-test-kserve-58684bc86b-57s8h"]
Apr 23 13:53:49.825278 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:49.825233 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" path="/var/lib/kubelet/pods/dd8ed085-bf83-463a-87ad-0516925e8af3/volumes"
Apr 23 13:53:53.904694 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:53.904637 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused"
Apr
23 13:53:53.922382 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:53.922355 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:53:53.927934 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:53:53.927898 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:54:03.904626 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:03.904584 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 23 13:54:03.928484 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:03.928442 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:54:13.905019 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:13.904974 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 23 13:54:13.928168 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:54:13.928135 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:54:14.494125 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.494096 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76fc9bdd58-z5wk2_d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2/main/0.log" Apr 23 13:54:14.494527 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.494499 2576 generic.go:358] "Generic (PLEG): container finished" podID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerID="07ca57736feab7730b4e98be298bbac145029e6f015b1981c31af7db308a2519" exitCode=137 Apr 23 13:54:14.494638 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.494553 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" event={"ID":"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2","Type":"ContainerDied","Data":"07ca57736feab7730b4e98be298bbac145029e6f015b1981c31af7db308a2519"} Apr 23 13:54:14.527234 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.527208 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76fc9bdd58-z5wk2_d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2/main/0.log" Apr 23 13:54:14.527690 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.527671 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:54:14.621494 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.621457 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tls-certs\") pod \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " Apr 23 13:54:14.621747 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.621505 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r6bl\" (UniqueName: \"kubernetes.io/projected/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kube-api-access-7r6bl\") pod \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " Apr 23 13:54:14.621747 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.621530 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-dshm\") pod \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " Apr 23 13:54:14.621747 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.621558 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kserve-provision-location\") pod \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " Apr 23 13:54:14.621747 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.621691 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-home\") pod \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " Apr 23 13:54:14.622003 ip-10-0-135-229 kubenswrapper[2576]: 
I0423 13:54:14.621785 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-model-cache\") pod \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " Apr 23 13:54:14.622003 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.621848 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tmp-dir\") pod \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\" (UID: \"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2\") " Apr 23 13:54:14.622116 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.622045 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-model-cache" (OuterVolumeSpecName: "model-cache") pod "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" (UID: "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:54:14.622169 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.622159 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:54:14.622272 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.622243 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-home" (OuterVolumeSpecName: "home") pod "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" (UID: "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:54:14.624430 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.624315 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-dshm" (OuterVolumeSpecName: "dshm") pod "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" (UID: "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:54:14.624430 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.624365 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kube-api-access-7r6bl" (OuterVolumeSpecName: "kube-api-access-7r6bl") pod "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" (UID: "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2"). InnerVolumeSpecName "kube-api-access-7r6bl". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:54:14.624623 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.624451 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" (UID: "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:54:14.632897 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.632868 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" (UID: "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:54:14.675654 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.675604 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" (UID: "d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:54:14.722988 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.722948 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:54:14.722988 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.722977 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7r6bl\" (UniqueName: \"kubernetes.io/projected/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kube-api-access-7r6bl\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:54:14.722988 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.722987 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:54:14.722988 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.722997 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:54:14.723349 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.723005 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:54:14.723349 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:14.723015 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:54:15.499581 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:15.499551 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_stop-feature-test-kserve-76fc9bdd58-z5wk2_d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2/main/0.log" Apr 23 13:54:15.500061 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:15.499883 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" event={"ID":"d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2","Type":"ContainerDied","Data":"baeec66a3d818f37f6977bb3eeed6aaf6ba76b77fc98af5b008cd762e75097ee"} Apr 23 13:54:15.500061 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:15.499923 2576 scope.go:117] "RemoveContainer" containerID="07ca57736feab7730b4e98be298bbac145029e6f015b1981c31af7db308a2519" Apr 23 13:54:15.500061 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:15.499991 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2" Apr 23 13:54:15.508361 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:15.508317 2576 scope.go:117] "RemoveContainer" containerID="d689ce8c62322cda5cf0d0a262f105dd9494f0ae1508de3daeb1e02c62bcbaec" Apr 23 13:54:15.527162 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:15.527138 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2"] Apr 23 13:54:15.530363 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:15.530318 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/stop-feature-test-kserve-76fc9bdd58-z5wk2"] Apr 23 13:54:15.826575 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:15.826538 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" path="/var/lib/kubelet/pods/d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2/volumes" Apr 23 13:54:23.904815 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:23.904766 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 23 13:54:23.928609 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:23.928574 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:54:33.904930 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:33.904884 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 23 13:54:33.928103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:33.928072 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:54:43.904201 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:43.904151 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 23 13:54:43.927944 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:43.927908 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:54:49.796305 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796272 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db"] Apr 23 13:54:49.796865 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796846 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" 
containerName="storage-initializer" Apr 23 13:54:49.796912 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796869 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="storage-initializer" Apr 23 13:54:49.796912 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796882 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" Apr 23 13:54:49.796912 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796890 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" Apr 23 13:54:49.796912 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796901 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="storage-initializer" Apr 23 13:54:49.796912 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796909 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="storage-initializer" Apr 23 13:54:49.797064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796930 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" Apr 23 13:54:49.797064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.796939 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" Apr 23 13:54:49.797064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.797045 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="d30a9fe8-5aa8-4d7e-805c-dd8780c2d7f2" containerName="main" Apr 23 13:54:49.797064 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.797059 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="dd8ed085-bf83-463a-87ad-0516925e8af3" containerName="main" Apr 23 
13:54:49.800307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.800287 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.802888 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.802866 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv8de1d74aab16d9cabd8b5aafeb5248e8-kserve-self-signed-certs\"" Apr 23 13:54:49.811542 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.811520 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db"] Apr 23 13:54:49.852391 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.852353 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.852535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.852457 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.852535 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.852524 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tls-certs\") pod 
\"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.852650 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.852622 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.852706 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.852676 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.852761 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.852727 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.852826 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.852760 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lsm5r\" (UniqueName: 
\"kubernetes.io/projected/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kube-api-access-lsm5r\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.953729 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.953694 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.953917 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.953757 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.953917 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.953806 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.953917 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.953858 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: 
\"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.953917 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.953883 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.954160 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.954016 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lsm5r\" (UniqueName: \"kubernetes.io/projected/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kube-api-access-lsm5r\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.954160 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.954096 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.954160 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.954101 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tmp-dir\") pod 
\"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.954343 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.954192 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.954343 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.954306 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.954472 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.954450 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-home\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.956067 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.956040 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-dshm\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: 
\"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.956183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.956157 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:49.962011 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:49.961985 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lsm5r\" (UniqueName: \"kubernetes.io/projected/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kube-api-access-lsm5r\") pod \"llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:50.112350 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:50.112257 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:54:50.244368 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:50.244338 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db"] Apr 23 13:54:50.246176 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:54:50.246146 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod641ddd3a_17b2_4dc1_a420_576fa2a331b7.slice/crio-f1e4549022cf48c6bed5c1056e44d1d9395e09367ed46a908f56de21946cf632 WatchSource:0}: Error finding container f1e4549022cf48c6bed5c1056e44d1d9395e09367ed46a908f56de21946cf632: Status 404 returned error can't find the container with id f1e4549022cf48c6bed5c1056e44d1d9395e09367ed46a908f56de21946cf632 Apr 23 13:54:50.630151 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:50.630101 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" event={"ID":"641ddd3a-17b2-4dc1-a420-576fa2a331b7","Type":"ContainerStarted","Data":"87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96"} Apr 23 13:54:50.630151 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:50.630152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" event={"ID":"641ddd3a-17b2-4dc1-a420-576fa2a331b7","Type":"ContainerStarted","Data":"f1e4549022cf48c6bed5c1056e44d1d9395e09367ed46a908f56de21946cf632"} Apr 23 13:54:53.904961 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:53.904917 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get 
\"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 23 13:54:53.928002 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:53.927970 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:54:54.646476 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:54.646447 2576 generic.go:358] "Generic (PLEG): container finished" podID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerID="87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96" exitCode=0 Apr 23 13:54:54.646644 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:54.646531 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" event={"ID":"641ddd3a-17b2-4dc1-a420-576fa2a331b7","Type":"ContainerDied","Data":"87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96"} Apr 23 13:54:55.652960 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:55.652924 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" event={"ID":"641ddd3a-17b2-4dc1-a420-576fa2a331b7","Type":"ContainerStarted","Data":"0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9"} Apr 23 13:54:55.674342 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:54:55.674262 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podStartSLOduration=6.674244565 podStartE2EDuration="6.674244565s" podCreationTimestamp="2026-04-23 13:54:49 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 
+0000 UTC" observedRunningTime="2026-04-23 13:54:55.672633123 +0000 UTC m=+1416.420635078" watchObservedRunningTime="2026-04-23 13:54:55.674244565 +0000 UTC m=+1416.422246521" Apr 23 13:55:00.112850 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:00.112809 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:55:00.113310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:00.112868 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:55:00.114360 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:00.114313 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:55:03.904776 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:03.904730 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 23 13:55:03.928696 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:03.928652 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:55:10.113012 ip-10-0-135-229 kubenswrapper[2576]: I0423 
13:55:10.112922 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:55:13.904569 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:13.904511 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 10.134.0.49:8001: connect: connection refused" Apr 23 13:55:13.927902 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:13.927858 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:55:20.112975 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:20.112927 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:55:23.905090 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:23.905037 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" probeResult="failure" output="Get \"https://10.134.0.49:8001/health\": dial tcp 
10.134.0.49:8001: connect: connection refused" Apr 23 13:55:23.928033 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:23.927996 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:55:30.113122 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:30.113076 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:55:33.914276 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:33.914242 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:55:33.927817 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:33.927776 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:55:33.930840 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:33.930815 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:55:40.112748 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:40.112702 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:55:43.928863 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:43.928814 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" probeResult="failure" output="Get \"https://10.134.0.50:8000/health\": dial tcp 10.134.0.50:8000: connect: connection refused" Apr 23 13:55:50.112816 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:50.112767 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:55:53.943166 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:53.943139 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:55:53.950976 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:55:53.950953 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:56:00.113313 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:00.113273 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 
10.134.0.51:8000: connect: connection refused" Apr 23 13:56:10.113286 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:10.113244 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:56:10.773972 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:10.773938 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"] Apr 23 13:56:10.774574 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:10.774545 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main" containerID="cri-o://85d864dcb93fa4f906d7f1c90b676fc709183811798d43dbde86cf33a9d2a0ea" gracePeriod=30 Apr 23 13:56:10.777592 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:10.777563 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"] Apr 23 13:56:10.777935 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:10.777882 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main" containerID="cri-o://04037263a4247664328367f1c5b6b39a6bc995a3fa5258be45e5832e0405c04c" gracePeriod=30 Apr 23 13:56:19.864028 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:19.864002 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:56:19.865489 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:19.865470 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 13:56:20.113233 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:20.113198 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:56:27.839971 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.839928 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm"] Apr 23 13:56:27.845344 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.845311 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:27.848232 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.848210 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-dockercfg-wt7ns\"" Apr 23 13:56:27.849911 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.849891 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"custom-route-timeout-pd-test-kserve-self-signed-certs\"" Apr 23 13:56:27.868234 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.868211 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm"] Apr 23 13:56:27.893093 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.893067 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7"] Apr 23 13:56:27.901775 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.901745 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:27.920373 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.920351 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7"] Apr 23 13:56:27.972091 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972065 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:27.972227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972108 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:27.972227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972181 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:27.972227 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972213 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume 
started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-model-cache\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:27.972401 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972235 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:27.972401 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972252 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac64d62-c91b-4e21-8923-8839b361cfe0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:27.972401 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972298 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:27.972401 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972389 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:27.972538 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972407 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:27.972538 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972425 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a173f8ae-85cc-411e-94ff-2bbde014f54c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:27.972538 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972452 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-dshm\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:27.972538 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972475 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-h7c6k\" (UniqueName: 
\"kubernetes.io/projected/3ac64d62-c91b-4e21-8923-8839b361cfe0-kube-api-access-h7c6k\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:27.972660 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972585 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-home\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:27.972660 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:27.972624 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7r8sm\" (UniqueName: \"kubernetes.io/projected/a173f8ae-85cc-411e-94ff-2bbde014f54c-kube-api-access-7r8sm\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073059 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073033 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073226 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073067 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-model-cache\") 
pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073226 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073085 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073226 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073101 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac64d62-c91b-4e21-8923-8839b361cfe0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073226 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073177 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073463 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073412 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073521 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073469 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073521 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073480 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073521 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073504 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-model-cache\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073680 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073517 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a173f8ae-85cc-411e-94ff-2bbde014f54c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073680 ip-10-0-135-229 
kubenswrapper[2576]: I0423 13:56:28.073580 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-dshm\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073680 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073590 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-tmp-dir\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073680 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073634 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-h7c6k\" (UniqueName: \"kubernetes.io/projected/3ac64d62-c91b-4e21-8923-8839b361cfe0-kube-api-access-h7c6k\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073680 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073654 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-home\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073924 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073712 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-home\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073924 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073743 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7r8sm\" (UniqueName: \"kubernetes.io/projected/a173f8ae-85cc-411e-94ff-2bbde014f54c-kube-api-access-7r8sm\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073924 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073780 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.073924 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073804 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.073924 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.073835 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-kserve-provision-location\") pod 
\"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.074185 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.074054 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-home\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.074185 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.074171 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-kserve-provision-location\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.074312 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.074292 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-model-cache\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.076063 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.076029 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-dshm\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.076171 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.076079 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac64d62-c91b-4e21-8923-8839b361cfe0-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.076231 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.076163 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a173f8ae-85cc-411e-94ff-2bbde014f54c-tls-certs\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.076288 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.076231 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-dshm\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.082892 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.082864 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7r8sm\" (UniqueName: \"kubernetes.io/projected/a173f8ae-85cc-411e-94ff-2bbde014f54c-kube-api-access-7r8sm\") pod \"custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.083111 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.083093 
2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-h7c6k\" (UniqueName: \"kubernetes.io/projected/3ac64d62-c91b-4e21-8923-8839b361cfe0-kube-api-access-h7c6k\") pod \"custom-route-timeout-pd-test-kserve-68777759dd-2rvhm\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.154638 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.154561 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.213243 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.213051 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:28.300255 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.300227 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm"] Apr 23 13:56:28.302880 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:56:28.302706 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod3ac64d62_c91b_4e21_8923_8839b361cfe0.slice/crio-b84c3c7a1276b527779b44c8bb212401dc2ce19828a1768459132b1e2c08128c WatchSource:0}: Error finding container b84c3c7a1276b527779b44c8bb212401dc2ce19828a1768459132b1e2c08128c: Status 404 returned error can't find the container with id b84c3c7a1276b527779b44c8bb212401dc2ce19828a1768459132b1e2c08128c Apr 23 13:56:28.362063 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.362034 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7"] Apr 23 13:56:28.368318 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:56:28.368294 2576 manager.go:1169] Failed to process watch 
event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-poda173f8ae_85cc_411e_94ff_2bbde014f54c.slice/crio-8872be730ef3f0e18d6503c4430cbdbeb5c68d5e055f1d4d9aa52e28d9672c84 WatchSource:0}: Error finding container 8872be730ef3f0e18d6503c4430cbdbeb5c68d5e055f1d4d9aa52e28d9672c84: Status 404 returned error can't find the container with id 8872be730ef3f0e18d6503c4430cbdbeb5c68d5e055f1d4d9aa52e28d9672c84 Apr 23 13:56:28.993268 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.993232 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" event={"ID":"3ac64d62-c91b-4e21-8923-8839b361cfe0","Type":"ContainerStarted","Data":"af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a"} Apr 23 13:56:28.993268 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.993273 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" event={"ID":"3ac64d62-c91b-4e21-8923-8839b361cfe0","Type":"ContainerStarted","Data":"b84c3c7a1276b527779b44c8bb212401dc2ce19828a1768459132b1e2c08128c"} Apr 23 13:56:28.993806 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.993378 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:28.994818 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.994791 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" event={"ID":"a173f8ae-85cc-411e-94ff-2bbde014f54c","Type":"ContainerStarted","Data":"30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0"} Apr 23 13:56:28.994972 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:28.994824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" 
event={"ID":"a173f8ae-85cc-411e-94ff-2bbde014f54c","Type":"ContainerStarted","Data":"8872be730ef3f0e18d6503c4430cbdbeb5c68d5e055f1d4d9aa52e28d9672c84"} Apr 23 13:56:30.005488 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:30.005435 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" event={"ID":"3ac64d62-c91b-4e21-8923-8839b361cfe0","Type":"ContainerStarted","Data":"dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549"} Apr 23 13:56:30.113528 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:30.113479 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" probeResult="failure" output="Get \"https://10.134.0.51:8000/health\": dial tcp 10.134.0.51:8000: connect: connection refused" Apr 23 13:56:33.021799 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:33.021769 2576 generic.go:358] "Generic (PLEG): container finished" podID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerID="30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0" exitCode=0 Apr 23 13:56:33.022163 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:33.021846 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" event={"ID":"a173f8ae-85cc-411e-94ff-2bbde014f54c","Type":"ContainerDied","Data":"30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0"} Apr 23 13:56:34.026407 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:34.026376 2576 generic.go:358] "Generic (PLEG): container finished" podID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerID="dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549" exitCode=0 Apr 23 13:56:34.027007 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:34.026451 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" event={"ID":"3ac64d62-c91b-4e21-8923-8839b361cfe0","Type":"ContainerDied","Data":"dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549"} Apr 23 13:56:34.028190 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:34.028168 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" event={"ID":"a173f8ae-85cc-411e-94ff-2bbde014f54c","Type":"ContainerStarted","Data":"851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78"} Apr 23 13:56:34.071909 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:34.071866 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podStartSLOduration=7.071854089 podStartE2EDuration="7.071854089s" podCreationTimestamp="2026-04-23 13:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:56:34.06962808 +0000 UTC m=+1514.817630034" watchObservedRunningTime="2026-04-23 13:56:34.071854089 +0000 UTC m=+1514.819856043" Apr 23 13:56:35.033604 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:35.033563 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" event={"ID":"3ac64d62-c91b-4e21-8923-8839b361cfe0","Type":"ContainerStarted","Data":"ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b"} Apr 23 13:56:35.059125 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:35.059070 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podStartSLOduration=8.059051621 podStartE2EDuration="8.059051621s" podCreationTimestamp="2026-04-23 13:56:27 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 
UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:56:35.055828771 +0000 UTC m=+1515.803830726" watchObservedRunningTime="2026-04-23 13:56:35.059051621 +0000 UTC m=+1515.807053574" Apr 23 13:56:38.155441 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:38.155396 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:38.155909 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:38.155516 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:38.156943 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:38.156889 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:56:38.168372 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:38.168349 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:56:38.213671 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:38.213633 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:38.213806 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:38.213684 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:56:38.214980 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:38.214956 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:56:40.122395 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:40.122362 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:56:40.130172 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:40.130141 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:56:40.774568 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:40.774516 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="llm-d-routing-sidecar" containerID="cri-o://f9b6148bc90a75213b25438f7439356fedefb96c99216eb30adfa521c8b2dd83" gracePeriod=2 Apr 23 13:56:41.057636 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.057612 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw_588a69c5-5a12-4037-b03f-725fdbee0106/main/0.log" Apr 23 13:56:41.058762 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.058547 2576 generic.go:358] "Generic (PLEG): container finished" podID="588a69c5-5a12-4037-b03f-725fdbee0106" containerID="85d864dcb93fa4f906d7f1c90b676fc709183811798d43dbde86cf33a9d2a0ea" exitCode=137 Apr 23 13:56:41.058762 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.058571 2576 generic.go:358] "Generic (PLEG): container finished" podID="588a69c5-5a12-4037-b03f-725fdbee0106" 
containerID="f9b6148bc90a75213b25438f7439356fedefb96c99216eb30adfa521c8b2dd83" exitCode=0 Apr 23 13:56:41.058762 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.058702 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" event={"ID":"588a69c5-5a12-4037-b03f-725fdbee0106","Type":"ContainerDied","Data":"85d864dcb93fa4f906d7f1c90b676fc709183811798d43dbde86cf33a9d2a0ea"} Apr 23 13:56:41.058762 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.058733 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" event={"ID":"588a69c5-5a12-4037-b03f-725fdbee0106","Type":"ContainerDied","Data":"f9b6148bc90a75213b25438f7439356fedefb96c99216eb30adfa521c8b2dd83"} Apr 23 13:56:41.060579 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.060553 2576 generic.go:358] "Generic (PLEG): container finished" podID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerID="04037263a4247664328367f1c5b6b39a6bc995a3fa5258be45e5832e0405c04c" exitCode=137 Apr 23 13:56:41.060718 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.060622 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" event={"ID":"682e1451-a1f8-41f8-a224-9a8f7ff69531","Type":"ContainerDied","Data":"04037263a4247664328367f1c5b6b39a6bc995a3fa5258be45e5832e0405c04c"} Apr 23 13:56:41.129307 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.129280 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw_588a69c5-5a12-4037-b03f-725fdbee0106/main/0.log" Apr 23 13:56:41.130095 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.130075 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" Apr 23 13:56:41.146483 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.146466 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" Apr 23 13:56:41.214134 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214106 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-kqlsz\" (UniqueName: \"kubernetes.io/projected/588a69c5-5a12-4037-b03f-725fdbee0106-kube-api-access-kqlsz\") pod \"588a69c5-5a12-4037-b03f-725fdbee0106\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " Apr 23 13:56:41.214310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214153 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-home\") pod \"588a69c5-5a12-4037-b03f-725fdbee0106\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " Apr 23 13:56:41.214310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214186 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-dshm\") pod \"682e1451-a1f8-41f8-a224-9a8f7ff69531\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " Apr 23 13:56:41.214310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214214 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/682e1451-a1f8-41f8-a224-9a8f7ff69531-tls-certs\") pod \"682e1451-a1f8-41f8-a224-9a8f7ff69531\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " Apr 23 13:56:41.214310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214235 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume 
\"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-model-cache\") pod \"588a69c5-5a12-4037-b03f-725fdbee0106\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " Apr 23 13:56:41.214310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214264 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-home\") pod \"682e1451-a1f8-41f8-a224-9a8f7ff69531\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " Apr 23 13:56:41.214310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214290 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-tmp-dir\") pod \"588a69c5-5a12-4037-b03f-725fdbee0106\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " Apr 23 13:56:41.214631 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214364 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-tmp-dir\") pod \"682e1451-a1f8-41f8-a224-9a8f7ff69531\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " Apr 23 13:56:41.214631 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214397 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-kserve-provision-location\") pod \"588a69c5-5a12-4037-b03f-725fdbee0106\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " Apr 23 13:56:41.214631 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214426 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-k8kdh\" (UniqueName: \"kubernetes.io/projected/682e1451-a1f8-41f8-a224-9a8f7ff69531-kube-api-access-k8kdh\") pod 
\"682e1451-a1f8-41f8-a224-9a8f7ff69531\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " Apr 23 13:56:41.214631 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214469 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-dshm\") pod \"588a69c5-5a12-4037-b03f-725fdbee0106\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " Apr 23 13:56:41.214631 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214505 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/588a69c5-5a12-4037-b03f-725fdbee0106-tls-certs\") pod \"588a69c5-5a12-4037-b03f-725fdbee0106\" (UID: \"588a69c5-5a12-4037-b03f-725fdbee0106\") " Apr 23 13:56:41.214631 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214556 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-kserve-provision-location\") pod \"682e1451-a1f8-41f8-a224-9a8f7ff69531\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " Apr 23 13:56:41.214631 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214607 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-model-cache\") pod \"682e1451-a1f8-41f8-a224-9a8f7ff69531\" (UID: \"682e1451-a1f8-41f8-a224-9a8f7ff69531\") " Apr 23 13:56:41.215012 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214821 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-home" (OuterVolumeSpecName: "home") pod "588a69c5-5a12-4037-b03f-725fdbee0106" (UID: "588a69c5-5a12-4037-b03f-725fdbee0106"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:41.215012 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.214940 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:56:41.217447 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.217195 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-dshm" (OuterVolumeSpecName: "dshm") pod "682e1451-a1f8-41f8-a224-9a8f7ff69531" (UID: "682e1451-a1f8-41f8-a224-9a8f7ff69531"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:41.218235 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.217648 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-dshm" (OuterVolumeSpecName: "dshm") pod "588a69c5-5a12-4037-b03f-725fdbee0106" (UID: "588a69c5-5a12-4037-b03f-725fdbee0106"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:41.218235 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.218035 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/588a69c5-5a12-4037-b03f-725fdbee0106-kube-api-access-kqlsz" (OuterVolumeSpecName: "kube-api-access-kqlsz") pod "588a69c5-5a12-4037-b03f-725fdbee0106" (UID: "588a69c5-5a12-4037-b03f-725fdbee0106"). InnerVolumeSpecName "kube-api-access-kqlsz". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:56:41.218559 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.218531 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-model-cache" (OuterVolumeSpecName: "model-cache") pod "588a69c5-5a12-4037-b03f-725fdbee0106" (UID: "588a69c5-5a12-4037-b03f-725fdbee0106"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:41.218703 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.218679 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-model-cache" (OuterVolumeSpecName: "model-cache") pod "682e1451-a1f8-41f8-a224-9a8f7ff69531" (UID: "682e1451-a1f8-41f8-a224-9a8f7ff69531"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:41.218899 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.218871 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-home" (OuterVolumeSpecName: "home") pod "682e1451-a1f8-41f8-a224-9a8f7ff69531" (UID: "682e1451-a1f8-41f8-a224-9a8f7ff69531"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:41.219961 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.219923 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/682e1451-a1f8-41f8-a224-9a8f7ff69531-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "682e1451-a1f8-41f8-a224-9a8f7ff69531" (UID: "682e1451-a1f8-41f8-a224-9a8f7ff69531"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:56:41.219961 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.219926 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/682e1451-a1f8-41f8-a224-9a8f7ff69531-kube-api-access-k8kdh" (OuterVolumeSpecName: "kube-api-access-k8kdh") pod "682e1451-a1f8-41f8-a224-9a8f7ff69531" (UID: "682e1451-a1f8-41f8-a224-9a8f7ff69531"). InnerVolumeSpecName "kube-api-access-k8kdh". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:56:41.220318 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.220283 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/588a69c5-5a12-4037-b03f-725fdbee0106-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "588a69c5-5a12-4037-b03f-725fdbee0106" (UID: "588a69c5-5a12-4037-b03f-725fdbee0106"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:56:41.227936 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.227913 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "588a69c5-5a12-4037-b03f-725fdbee0106" (UID: "588a69c5-5a12-4037-b03f-725fdbee0106"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:56:41.231170 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.231143 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "682e1451-a1f8-41f8-a224-9a8f7ff69531" (UID: "682e1451-a1f8-41f8-a224-9a8f7ff69531"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:56:41.244716 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.244693 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "588a69c5-5a12-4037-b03f-725fdbee0106" (UID: "588a69c5-5a12-4037-b03f-725fdbee0106"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:56:41.246017 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.245985 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "682e1451-a1f8-41f8-a224-9a8f7ff69531" (UID: "682e1451-a1f8-41f8-a224-9a8f7ff69531"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:56:41.315800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315717 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.315800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315744 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.315800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315755 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-k8kdh\" (UniqueName: \"kubernetes.io/projected/682e1451-a1f8-41f8-a224-9a8f7ff69531-kube-api-access-k8kdh\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.315800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315765 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.315800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315775 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/588a69c5-5a12-4037-b03f-725fdbee0106-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.315800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315783 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.315800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315791 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.315800 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315800 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-kqlsz\" (UniqueName: \"kubernetes.io/projected/588a69c5-5a12-4037-b03f-725fdbee0106-kube-api-access-kqlsz\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.316183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315809 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.316183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315818 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/682e1451-a1f8-41f8-a224-9a8f7ff69531-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.316183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315826 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.316183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315833 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/682e1451-a1f8-41f8-a224-9a8f7ff69531-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:41.316183 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:41.315842 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/588a69c5-5a12-4037-b03f-725fdbee0106-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:56:42.065536 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.065451 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw_588a69c5-5a12-4037-b03f-725fdbee0106/main/0.log"
Apr 23 13:56:42.066342 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.066302 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"
Apr 23 13:56:42.066543 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.066296 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw" event={"ID":"588a69c5-5a12-4037-b03f-725fdbee0106","Type":"ContainerDied","Data":"fa308f4f04efb19cf47008aed9dbb682f9e04c72a7068b9f49c564c418f35988"}
Apr 23 13:56:42.066543 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.066475 2576 scope.go:117] "RemoveContainer" containerID="85d864dcb93fa4f906d7f1c90b676fc709183811798d43dbde86cf33a9d2a0ea"
Apr 23 13:56:42.068666 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.068633 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9" event={"ID":"682e1451-a1f8-41f8-a224-9a8f7ff69531","Type":"ContainerDied","Data":"1a1f19808b22030a73b667975194e15a246c09e8ff4bc464b4ed2a15214a78be"}
Apr 23 13:56:42.068964 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.068942 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"
Apr 23 13:56:42.077826 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.077799 2576 scope.go:117] "RemoveContainer" containerID="b9c3ba5e5705473b001c1214f565be80bdd7e7b2a7ba252aa2424be6432e1505"
Apr 23 13:56:42.090252 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.090226 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"]
Apr 23 13:56:42.095622 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.095594 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-prefill-54t84d9"]
Apr 23 13:56:42.104345 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.104308 2576 scope.go:117] "RemoveContainer" containerID="f9b6148bc90a75213b25438f7439356fedefb96c99216eb30adfa521c8b2dd83"
Apr 23 13:56:42.113205 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.110323 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"]
Apr 23 13:56:42.113205 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.113148 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-05aa9bba-kserve-5477c998497f5hw"]
Apr 23 13:56:42.114358 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.114319 2576 scope.go:117] "RemoveContainer" containerID="04037263a4247664328367f1c5b6b39a6bc995a3fa5258be45e5832e0405c04c"
Apr 23 13:56:42.123546 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:42.123519 2576 scope.go:117] "RemoveContainer" containerID="1767b8b2e0adf7d30763e27d65fe2b72f2c394929379dc5aec0191188837166c"
Apr 23 13:56:43.826418 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:43.826381 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" path="/var/lib/kubelet/pods/588a69c5-5a12-4037-b03f-725fdbee0106/volumes"
Apr 23 13:56:43.827045 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:43.827022 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" path="/var/lib/kubelet/pods/682e1451-a1f8-41f8-a224-9a8f7ff69531/volumes"
Apr 23 13:56:45.066357 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:45.064342 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db"]
Apr 23 13:56:45.066357 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:45.064750 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" containerID="cri-o://0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9" gracePeriod=30
Apr 23 13:56:48.155461 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:48.155415 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 23 13:56:48.213697 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:48.213657 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 23 13:56:53.748435 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.748394 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 23 13:56:53.748968 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.748950 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="llm-d-routing-sidecar"
Apr 23 13:56:53.749013 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.748973 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="llm-d-routing-sidecar"
Apr 23 13:56:53.749013 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.748993 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="storage-initializer"
Apr 23 13:56:53.749013 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749002 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="storage-initializer"
Apr 23 13:56:53.749103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749014 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="storage-initializer"
Apr 23 13:56:53.749103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749023 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="storage-initializer"
Apr 23 13:56:53.749103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749040 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main"
Apr 23 13:56:53.749103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749048 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main"
Apr 23 13:56:53.749103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749058 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main"
Apr 23 13:56:53.749103 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749066 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main"
Apr 23 13:56:53.749282 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749156 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="682e1451-a1f8-41f8-a224-9a8f7ff69531" containerName="main"
Apr 23 13:56:53.749282 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749168 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="llm-d-routing-sidecar"
Apr 23 13:56:53.749282 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.749180 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="588a69c5-5a12-4037-b03f-725fdbee0106" containerName="main"
Apr 23 13:56:53.778830 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.778796 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 23 13:56:53.778996 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.778932 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.781799 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.781772 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisvc-model-fb-opt-125m-route-f312f5-cb7fb8cf-dockercfg-8kjw9\""
Apr 23 13:56:53.781945 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.781801 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"kserve-ci-e2e-test\"/\"llmisv4e643bc258191ffc517a31cd1d0ddd27-kserve-self-signed-certs\""
Apr 23 13:56:53.840428 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.840396 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.840592 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.840434 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.840592 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.840518 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-7xr6w\" (UniqueName: \"kubernetes.io/projected/00b3535e-4730-4095-a7b8-1cbe82f0873a-kube-api-access-7xr6w\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.840722 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.840586 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.840722 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.840621 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.840722 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.840650 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.840722 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.840688 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00b3535e-4730-4095-a7b8-1cbe82f0873a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.941445 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.941410 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.941617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.941487 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-7xr6w\" (UniqueName: \"kubernetes.io/projected/00b3535e-4730-4095-a7b8-1cbe82f0873a-kube-api-access-7xr6w\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.941617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.941513 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.941617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.941536 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.941617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.941553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.941617 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.941575 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00b3535e-4730-4095-a7b8-1cbe82f0873a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.941842 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.941656 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.941842 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.941803 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-tmp-dir\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.942072 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.942049 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-kserve-provision-location\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.942167 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.942145 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-model-cache\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.942167 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.942155 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-home\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.944028 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.944003 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-dshm\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.944149 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.944125 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00b3535e-4730-4095-a7b8-1cbe82f0873a-tls-certs\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:53.951583 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:53.951561 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-7xr6w\" (UniqueName: \"kubernetes.io/projected/00b3535e-4730-4095-a7b8-1cbe82f0873a-kube-api-access-7xr6w\") pod \"llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:54.090202 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:54.090168 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:56:54.428098 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:54.428058 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"]
Apr 23 13:56:54.429534 ip-10-0-135-229 kubenswrapper[2576]: W0423 13:56:54.429507 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-burstable.slice/kubepods-burstable-pod00b3535e_4730_4095_a7b8_1cbe82f0873a.slice/crio-ec299970e04320717ac31a04ae0f82b409bba7b9a42d0001440a861e3ad11d34 WatchSource:0}: Error finding container ec299970e04320717ac31a04ae0f82b409bba7b9a42d0001440a861e3ad11d34: Status 404 returned error can't find the container with id ec299970e04320717ac31a04ae0f82b409bba7b9a42d0001440a861e3ad11d34
Apr 23 13:56:55.139423 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:55.139384 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"00b3535e-4730-4095-a7b8-1cbe82f0873a","Type":"ContainerStarted","Data":"62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07"}
Apr 23 13:56:55.139423 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:55.139423 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"00b3535e-4730-4095-a7b8-1cbe82f0873a","Type":"ContainerStarted","Data":"ec299970e04320717ac31a04ae0f82b409bba7b9a42d0001440a861e3ad11d34"}
Apr 23 13:56:58.156139 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:58.156004 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 23 13:56:58.213732 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:58.213692 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 23 13:56:59.157809 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:59.157771 2576 generic.go:358] "Generic (PLEG): container finished" podID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerID="62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07" exitCode=0
Apr 23 13:56:59.158324 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:56:59.157836 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"00b3535e-4730-4095-a7b8-1cbe82f0873a","Type":"ContainerDied","Data":"62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07"}
Apr 23 13:57:00.163882 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:00.163844 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"00b3535e-4730-4095-a7b8-1cbe82f0873a","Type":"ContainerStarted","Data":"70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a"}
Apr 23 13:57:00.188407 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:00.188358 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podStartSLOduration=7.188325392 podStartE2EDuration="7.188325392s" podCreationTimestamp="2026-04-23 13:56:53 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 13:57:00.185114953 +0000 UTC m=+1540.933116907" watchObservedRunningTime="2026-04-23 13:57:00.188325392 +0000 UTC m=+1540.936327346"
Apr 23 13:57:04.090719 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:04.090669 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="unhealthy" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"
Apr 23 13:57:04.092512 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:04.092481 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused"
Apr 23 13:57:08.155776 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:08.155729 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused"
Apr 23 13:57:08.214285 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:08.214243 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused"
Apr 23 13:57:14.091173 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:14.091126 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused"
Apr 23 13:57:15.458774 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.458753 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db_641ddd3a-17b2-4dc1-a420-576fa2a331b7/main/0.log"
Apr 23 13:57:15.459186 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.459169 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db"
Apr 23 13:57:15.465618 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.465585 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-dshm\") pod \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") "
Apr 23 13:57:15.465703 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.465641 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-home\") pod \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") "
Apr 23 13:57:15.465703 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.465686 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-model-cache\") pod \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") "
Apr 23 13:57:15.465773 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.465731 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tmp-dir\") pod \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") "
Apr 23 13:57:15.465807 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.465772 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lsm5r\" (UniqueName: \"kubernetes.io/projected/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kube-api-access-lsm5r\") pod \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") "
Apr 23 13:57:15.465844 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.465809 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kserve-provision-location\") pod \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") "
Apr 23 13:57:15.465877 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.465866 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tls-certs\") pod \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\" (UID: \"641ddd3a-17b2-4dc1-a420-576fa2a331b7\") "
Apr 23 13:57:15.467755 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.466142 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-model-cache" (OuterVolumeSpecName: "model-cache") pod "641ddd3a-17b2-4dc1-a420-576fa2a331b7" (UID: "641ddd3a-17b2-4dc1-a420-576fa2a331b7"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:57:15.468222 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.468086 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-dshm" (OuterVolumeSpecName: "dshm") pod "641ddd3a-17b2-4dc1-a420-576fa2a331b7" (UID: "641ddd3a-17b2-4dc1-a420-576fa2a331b7"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:57:15.468222 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.468120 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-home" (OuterVolumeSpecName: "home") pod "641ddd3a-17b2-4dc1-a420-576fa2a331b7" (UID: "641ddd3a-17b2-4dc1-a420-576fa2a331b7"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:57:15.468850 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.468636 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "641ddd3a-17b2-4dc1-a420-576fa2a331b7" (UID: "641ddd3a-17b2-4dc1-a420-576fa2a331b7"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue ""
Apr 23 13:57:15.470026 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.469984 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kube-api-access-lsm5r" (OuterVolumeSpecName: "kube-api-access-lsm5r") pod "641ddd3a-17b2-4dc1-a420-576fa2a331b7" (UID: "641ddd3a-17b2-4dc1-a420-576fa2a331b7"). InnerVolumeSpecName "kube-api-access-lsm5r". PluginName "kubernetes.io/projected", VolumeGIDValue ""
Apr 23 13:57:15.486946 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.486912 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "641ddd3a-17b2-4dc1-a420-576fa2a331b7" (UID: "641ddd3a-17b2-4dc1-a420-576fa2a331b7"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:57:15.493353 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.493302 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "641ddd3a-17b2-4dc1-a420-576fa2a331b7" (UID: "641ddd3a-17b2-4dc1-a420-576fa2a331b7"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue ""
Apr 23 13:57:15.567777 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.567740 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:57:15.567777 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.567780 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:57:15.568027 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.567794 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lsm5r\" (UniqueName: \"kubernetes.io/projected/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kube-api-access-lsm5r\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:57:15.568027 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.567809 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:57:15.568027 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.567823 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/641ddd3a-17b2-4dc1-a420-576fa2a331b7-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:57:15.568027 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.567836 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:57:15.568027 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:15.567848 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/641ddd3a-17b2-4dc1-a420-576fa2a331b7-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\""
Apr 23 13:57:16.238802 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.238721 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db_641ddd3a-17b2-4dc1-a420-576fa2a331b7/main/0.log"
Apr 23 13:57:16.239132 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.239105 2576 generic.go:358] "Generic (PLEG): container finished" podID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerID="0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9" exitCode=137
Apr 23 13:57:16.239246 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.239184 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" event={"ID":"641ddd3a-17b2-4dc1-a420-576fa2a331b7","Type":"ContainerDied","Data":"0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9"}
Apr 23 13:57:16.239246 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.239198 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" Apr 23 13:57:16.239246 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.239219 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db" event={"ID":"641ddd3a-17b2-4dc1-a420-576fa2a331b7","Type":"ContainerDied","Data":"f1e4549022cf48c6bed5c1056e44d1d9395e09367ed46a908f56de21946cf632"} Apr 23 13:57:16.239246 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.239239 2576 scope.go:117] "RemoveContainer" containerID="0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9" Apr 23 13:57:16.250446 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.250410 2576 scope.go:117] "RemoveContainer" containerID="87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96" Apr 23 13:57:16.261869 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.261844 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db"] Apr 23 13:57:16.264834 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.264812 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-02ac88e7-kserve-bbf7cb498-k84db"] Apr 23 13:57:16.288207 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.288180 2576 scope.go:117] "RemoveContainer" containerID="0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9" Apr 23 13:57:16.288540 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:57:16.288520 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9\": container with ID starting with 0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9 not found: ID does not exist" 
containerID="0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9" Apr 23 13:57:16.288629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.288552 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9"} err="failed to get container status \"0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9\": rpc error: code = NotFound desc = could not find container \"0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9\": container with ID starting with 0c5d4517a881395309b7ac57ed2db37e45de64d89258e4690915d02f5b91f5e9 not found: ID does not exist" Apr 23 13:57:16.288629 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.288572 2576 scope.go:117] "RemoveContainer" containerID="87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96" Apr 23 13:57:16.288877 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:57:16.288860 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96\": container with ID starting with 87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96 not found: ID does not exist" containerID="87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96" Apr 23 13:57:16.288877 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:16.288880 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96"} err="failed to get container status \"87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96\": rpc error: code = NotFound desc = could not find container \"87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96\": container with ID starting with 87904b8ecce0c9a4ad82a8878e6af7323db0481833caf8eedb970a05b88ada96 not found: ID does not exist" Apr 23 
13:57:17.827082 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:17.827042 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" path="/var/lib/kubelet/pods/641ddd3a-17b2-4dc1-a420-576fa2a331b7/volumes" Apr 23 13:57:18.155503 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:18.155391 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:57:18.213559 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:18.213506 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:57:24.090619 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:24.090559 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 13:57:24.091374 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:24.091020 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:57:28.155324 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:28.155269 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" 
containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:57:28.213930 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:28.213886 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:57:34.091471 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:34.091419 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:57:38.155302 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:38.155257 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:57:38.213655 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:38.213618 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:57:44.090548 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:44.090501 2576 prober.go:120] "Probe failed" probeType="Startup" 
pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:57:48.155458 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:48.155398 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:57:48.213693 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:48.213656 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:57:54.090735 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:54.090691 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:57:58.155844 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:58.155790 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:57:58.214343 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:57:58.214294 
2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:58:04.090825 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:04.090786 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:58:08.155837 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:08.155506 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:58:08.214021 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:08.213968 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:58:14.090698 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:14.090651 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 
13:58:18.154983 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:18.154931 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:58:18.213659 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:18.213615 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:58:24.091177 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:24.091134 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:58:28.155298 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:28.155242 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:58:28.213786 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:28.213732 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": 
dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:58:34.091304 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:34.091260 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:58:38.155468 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:38.155411 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:58:38.214407 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:38.214361 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:58:44.091310 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:44.091270 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:58:48.155202 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:48.155154 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" 
probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:58:48.213917 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:48.213866 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:58:54.091154 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:54.091102 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:58:58.155407 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:58.155363 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" probeResult="failure" output="Get \"https://10.134.0.52:8001/health\": dial tcp 10.134.0.52:8001: connect: connection refused" Apr 23 13:58:58.213674 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:58:58.213635 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" probeResult="failure" output="Get \"https://10.134.0.53:8000/health\": dial tcp 10.134.0.53:8000: connect: connection refused" Apr 23 13:59:04.091046 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:04.091003 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" 
podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:59:08.165432 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:08.165398 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:59:08.176944 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:08.176921 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:59:08.224544 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:08.224515 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:59:08.234058 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:08.234028 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:59:14.091381 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:14.091310 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:59:24.091269 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:24.091213 2576 prober.go:120] "Probe failed" probeType="Startup" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" probeResult="failure" output="Get \"https://10.134.0.54:8000/health\": dial tcp 10.134.0.54:8000: connect: connection refused" Apr 23 13:59:25.655313 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:25.655276 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7"] Apr 23 13:59:25.655931 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:25.655702 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" containerID="cri-o://851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78" gracePeriod=30 Apr 23 13:59:25.657425 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:25.657392 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm"] Apr 23 13:59:25.657744 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:25.657708 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" containerID="cri-o://ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b" gracePeriod=30 Apr 23 13:59:34.100399 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:34.100368 2576 kubelet.go:2658] "SyncLoop (probe)" probe="startup" status="started" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 13:59:34.107510 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:34.107489 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 13:59:51.831292 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:51.831218 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 23 13:59:51.831691 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:51.831572 
2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" containerID="cri-o://70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a" gracePeriod=30 Apr 23 13:59:52.573829 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.573807 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 13:59:52.739835 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.739749 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-dshm\") pod \"00b3535e-4730-4095-a7b8-1cbe82f0873a\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " Apr 23 13:59:52.740005 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.739841 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00b3535e-4730-4095-a7b8-1cbe82f0873a-tls-certs\") pod \"00b3535e-4730-4095-a7b8-1cbe82f0873a\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " Apr 23 13:59:52.740005 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.739879 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-tmp-dir\") pod \"00b3535e-4730-4095-a7b8-1cbe82f0873a\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " Apr 23 13:59:52.740005 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.739907 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7xr6w\" (UniqueName: \"kubernetes.io/projected/00b3535e-4730-4095-a7b8-1cbe82f0873a-kube-api-access-7xr6w\") pod \"00b3535e-4730-4095-a7b8-1cbe82f0873a\" (UID: 
\"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " Apr 23 13:59:52.740005 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.739937 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-kserve-provision-location\") pod \"00b3535e-4730-4095-a7b8-1cbe82f0873a\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " Apr 23 13:59:52.740005 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.739966 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-home\") pod \"00b3535e-4730-4095-a7b8-1cbe82f0873a\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " Apr 23 13:59:52.740005 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.739995 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-model-cache\") pod \"00b3535e-4730-4095-a7b8-1cbe82f0873a\" (UID: \"00b3535e-4730-4095-a7b8-1cbe82f0873a\") " Apr 23 13:59:52.740484 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.740456 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-model-cache" (OuterVolumeSpecName: "model-cache") pod "00b3535e-4730-4095-a7b8-1cbe82f0873a" (UID: "00b3535e-4730-4095-a7b8-1cbe82f0873a"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:52.740844 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.740818 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-home" (OuterVolumeSpecName: "home") pod "00b3535e-4730-4095-a7b8-1cbe82f0873a" (UID: "00b3535e-4730-4095-a7b8-1cbe82f0873a"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:52.742135 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.742107 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/00b3535e-4730-4095-a7b8-1cbe82f0873a-kube-api-access-7xr6w" (OuterVolumeSpecName: "kube-api-access-7xr6w") pod "00b3535e-4730-4095-a7b8-1cbe82f0873a" (UID: "00b3535e-4730-4095-a7b8-1cbe82f0873a"). InnerVolumeSpecName "kube-api-access-7xr6w". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:59:52.742308 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.742284 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-dshm" (OuterVolumeSpecName: "dshm") pod "00b3535e-4730-4095-a7b8-1cbe82f0873a" (UID: "00b3535e-4730-4095-a7b8-1cbe82f0873a"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:52.742409 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.742317 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/00b3535e-4730-4095-a7b8-1cbe82f0873a-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "00b3535e-4730-4095-a7b8-1cbe82f0873a" (UID: "00b3535e-4730-4095-a7b8-1cbe82f0873a"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:59:52.758813 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.758784 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "00b3535e-4730-4095-a7b8-1cbe82f0873a" (UID: "00b3535e-4730-4095-a7b8-1cbe82f0873a"). InnerVolumeSpecName "tmp-dir". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:52.793697 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.793668 2576 generic.go:358] "Generic (PLEG): container finished" podID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerID="70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a" exitCode=0 Apr 23 13:59:52.793831 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.793736 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"00b3535e-4730-4095-a7b8-1cbe82f0873a","Type":"ContainerDied","Data":"70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a"} Apr 23 13:59:52.793831 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.793768 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" event={"ID":"00b3535e-4730-4095-a7b8-1cbe82f0873a","Type":"ContainerDied","Data":"ec299970e04320717ac31a04ae0f82b409bba7b9a42d0001440a861e3ad11d34"} Apr 23 13:59:52.793831 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.793783 2576 scope.go:117] "RemoveContainer" containerID="70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a" Apr 23 13:59:52.793831 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.793783 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0" Apr 23 13:59:52.798934 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.798911 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "00b3535e-4730-4095-a7b8-1cbe82f0873a" (UID: "00b3535e-4730-4095-a7b8-1cbe82f0873a"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:52.801926 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.801907 2576 scope.go:117] "RemoveContainer" containerID="62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07" Apr 23 13:59:52.841515 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.841491 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:52.841515 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.841512 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/00b3535e-4730-4095-a7b8-1cbe82f0873a-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:52.841515 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.841521 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:52.841905 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.841530 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7xr6w\" (UniqueName: \"kubernetes.io/projected/00b3535e-4730-4095-a7b8-1cbe82f0873a-kube-api-access-7xr6w\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:52.841905 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.841539 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:52.841905 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.841549 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: 
\"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:52.841905 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.841557 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/00b3535e-4730-4095-a7b8-1cbe82f0873a-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:52.862281 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.862152 2576 scope.go:117] "RemoveContainer" containerID="70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a" Apr 23 13:59:52.862485 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:59:52.862468 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a\": container with ID starting with 70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a not found: ID does not exist" containerID="70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a" Apr 23 13:59:52.862545 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.862495 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a"} err="failed to get container status \"70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a\": rpc error: code = NotFound desc = could not find container \"70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a\": container with ID starting with 70635d0819884399425a17c88f7a428e60f25a774344b9cb3b311819f7d78d2a not found: ID does not exist" Apr 23 13:59:52.862545 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.862514 2576 scope.go:117] "RemoveContainer" containerID="62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07" Apr 23 13:59:52.862794 ip-10-0-135-229 kubenswrapper[2576]: E0423 
13:59:52.862778 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07\": container with ID starting with 62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07 not found: ID does not exist" containerID="62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07" Apr 23 13:59:52.862836 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:52.862800 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07"} err="failed to get container status \"62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07\": rpc error: code = NotFound desc = could not find container \"62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07\": container with ID starting with 62c97f697cf3773e08bf7986082607281d5cc7cd9d94f2adde8e991583110f07 not found: ID does not exist" Apr 23 13:59:53.116365 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:53.116317 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 23 13:59:53.119784 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:53.119757 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/llmisvc-model-fb-opt-125m-route-f312f5ec-kserve-mn-0"] Apr 23 13:59:53.825684 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:53.825656 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" path="/var/lib/kubelet/pods/00b3535e-4730-4095-a7b8-1cbe82f0873a/volumes" Apr 23 13:59:55.657957 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:55.657907 2576 kuberuntime_container.go:864] "Killing container with a grace period" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" 
podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="llm-d-routing-sidecar" containerID="cri-o://af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a" gracePeriod=2 Apr 23 13:59:55.807295 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:55.807275 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-68777759dd-2rvhm_3ac64d62-c91b-4e21-8923-8839b361cfe0/main/0.log" Apr 23 13:59:55.807877 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:55.807853 2576 generic.go:358] "Generic (PLEG): container finished" podID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerID="af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a" exitCode=0 Apr 23 13:59:55.807941 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:55.807927 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" event={"ID":"3ac64d62-c91b-4e21-8923-8839b361cfe0","Type":"ContainerDied","Data":"af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a"} Apr 23 13:59:55.946411 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:55.946390 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-68777759dd-2rvhm_3ac64d62-c91b-4e21-8923-8839b361cfe0/main/0.log" Apr 23 13:59:55.947171 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:55.947152 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:59:55.966013 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:55.965995 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:59:56.069139 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069113 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-kserve-provision-location\") pod \"3ac64d62-c91b-4e21-8923-8839b361cfe0\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " Apr 23 13:59:56.069309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069159 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-dshm\") pod \"a173f8ae-85cc-411e-94ff-2bbde014f54c\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " Apr 23 13:59:56.069309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069178 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-7r8sm\" (UniqueName: \"kubernetes.io/projected/a173f8ae-85cc-411e-94ff-2bbde014f54c-kube-api-access-7r8sm\") pod \"a173f8ae-85cc-411e-94ff-2bbde014f54c\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " Apr 23 13:59:56.069309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069202 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-model-cache\") pod \"a173f8ae-85cc-411e-94ff-2bbde014f54c\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " Apr 23 13:59:56.069309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069223 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-tmp-dir\") pod \"a173f8ae-85cc-411e-94ff-2bbde014f54c\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " Apr 23 13:59:56.069564 
ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069315 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-dshm\") pod \"3ac64d62-c91b-4e21-8923-8839b361cfe0\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " Apr 23 13:59:56.069564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069399 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-tmp-dir\") pod \"3ac64d62-c91b-4e21-8923-8839b361cfe0\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " Apr 23 13:59:56.069564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069428 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-home\") pod \"a173f8ae-85cc-411e-94ff-2bbde014f54c\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " Apr 23 13:59:56.069564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069491 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-h7c6k\" (UniqueName: \"kubernetes.io/projected/3ac64d62-c91b-4e21-8923-8839b361cfe0-kube-api-access-h7c6k\") pod \"3ac64d62-c91b-4e21-8923-8839b361cfe0\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " Apr 23 13:59:56.069564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069515 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-model-cache" (OuterVolumeSpecName: "model-cache") pod "a173f8ae-85cc-411e-94ff-2bbde014f54c" (UID: "a173f8ae-85cc-411e-94ff-2bbde014f54c"). InnerVolumeSpecName "model-cache". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.069564 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069525 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac64d62-c91b-4e21-8923-8839b361cfe0-tls-certs\") pod \"3ac64d62-c91b-4e21-8923-8839b361cfe0\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " Apr 23 13:59:56.069881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069580 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-model-cache\") pod \"3ac64d62-c91b-4e21-8923-8839b361cfe0\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " Apr 23 13:59:56.069881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069611 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-kserve-provision-location\") pod \"a173f8ae-85cc-411e-94ff-2bbde014f54c\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " Apr 23 13:59:56.069881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069692 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-home\") pod \"3ac64d62-c91b-4e21-8923-8839b361cfe0\" (UID: \"3ac64d62-c91b-4e21-8923-8839b361cfe0\") " Apr 23 13:59:56.069881 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.069744 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a173f8ae-85cc-411e-94ff-2bbde014f54c-tls-certs\") pod \"a173f8ae-85cc-411e-94ff-2bbde014f54c\" (UID: \"a173f8ae-85cc-411e-94ff-2bbde014f54c\") " Apr 23 13:59:56.070293 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.070111 2576 
reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.070942 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.070895 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-model-cache" (OuterVolumeSpecName: "model-cache") pod "3ac64d62-c91b-4e21-8923-8839b361cfe0" (UID: "3ac64d62-c91b-4e21-8923-8839b361cfe0"). InnerVolumeSpecName "model-cache". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.073053 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.072095 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-dshm" (OuterVolumeSpecName: "dshm") pod "a173f8ae-85cc-411e-94ff-2bbde014f54c" (UID: "a173f8ae-85cc-411e-94ff-2bbde014f54c"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.073053 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.072201 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/a173f8ae-85cc-411e-94ff-2bbde014f54c-kube-api-access-7r8sm" (OuterVolumeSpecName: "kube-api-access-7r8sm") pod "a173f8ae-85cc-411e-94ff-2bbde014f54c" (UID: "a173f8ae-85cc-411e-94ff-2bbde014f54c"). InnerVolumeSpecName "kube-api-access-7r8sm". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:59:56.073053 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.072374 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-home" (OuterVolumeSpecName: "home") pod "a173f8ae-85cc-411e-94ff-2bbde014f54c" (UID: "a173f8ae-85cc-411e-94ff-2bbde014f54c"). InnerVolumeSpecName "home". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.073053 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.072656 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-home" (OuterVolumeSpecName: "home") pod "3ac64d62-c91b-4e21-8923-8839b361cfe0" (UID: "3ac64d62-c91b-4e21-8923-8839b361cfe0"). InnerVolumeSpecName "home". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.073053 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.073019 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-dshm" (OuterVolumeSpecName: "dshm") pod "3ac64d62-c91b-4e21-8923-8839b361cfe0" (UID: "3ac64d62-c91b-4e21-8923-8839b361cfe0"). InnerVolumeSpecName "dshm". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.073389 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.073097 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/3ac64d62-c91b-4e21-8923-8839b361cfe0-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "3ac64d62-c91b-4e21-8923-8839b361cfe0" (UID: "3ac64d62-c91b-4e21-8923-8839b361cfe0"). InnerVolumeSpecName "tls-certs". PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:59:56.073919 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.073887 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/secret/a173f8ae-85cc-411e-94ff-2bbde014f54c-tls-certs" (OuterVolumeSpecName: "tls-certs") pod "a173f8ae-85cc-411e-94ff-2bbde014f54c" (UID: "a173f8ae-85cc-411e-94ff-2bbde014f54c"). InnerVolumeSpecName "tls-certs". 
PluginName "kubernetes.io/secret", VolumeGIDValue "" Apr 23 13:59:56.074953 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.074927 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3ac64d62-c91b-4e21-8923-8839b361cfe0-kube-api-access-h7c6k" (OuterVolumeSpecName: "kube-api-access-h7c6k") pod "3ac64d62-c91b-4e21-8923-8839b361cfe0" (UID: "3ac64d62-c91b-4e21-8923-8839b361cfe0"). InnerVolumeSpecName "kube-api-access-h7c6k". PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 13:59:56.086036 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.086014 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "3ac64d62-c91b-4e21-8923-8839b361cfe0" (UID: "3ac64d62-c91b-4e21-8923-8839b361cfe0"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.090157 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.090139 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-tmp-dir" (OuterVolumeSpecName: "tmp-dir") pod "a173f8ae-85cc-411e-94ff-2bbde014f54c" (UID: "a173f8ae-85cc-411e-94ff-2bbde014f54c"). InnerVolumeSpecName "tmp-dir". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.102781 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.102758 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "3ac64d62-c91b-4e21-8923-8839b361cfe0" (UID: "3ac64d62-c91b-4e21-8923-8839b361cfe0"). InnerVolumeSpecName "kserve-provision-location". 
PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.133212 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.133188 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-kserve-provision-location" (OuterVolumeSpecName: "kserve-provision-location") pod "a173f8ae-85cc-411e-94ff-2bbde014f54c" (UID: "a173f8ae-85cc-411e-94ff-2bbde014f54c"). InnerVolumeSpecName "kserve-provision-location". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 13:59:56.170678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170625 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170645 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170655 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-h7c6k\" (UniqueName: \"kubernetes.io/projected/3ac64d62-c91b-4e21-8923-8839b361cfe0-kube-api-access-h7c6k\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170664 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/3ac64d62-c91b-4e21-8923-8839b361cfe0-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170678 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170672 2576 reconciler_common.go:299] "Volume detached for volume \"model-cache\" (UniqueName: 
\"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-model-cache\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170681 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170689 2576 reconciler_common.go:299] "Volume detached for volume \"home\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-home\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170697 2576 reconciler_common.go:299] "Volume detached for volume \"tls-certs\" (UniqueName: \"kubernetes.io/secret/a173f8ae-85cc-411e-94ff-2bbde014f54c-tls-certs\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170705 2576 reconciler_common.go:299] "Volume detached for volume \"kserve-provision-location\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-kserve-provision-location\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170714 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170722 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-7r8sm\" (UniqueName: \"kubernetes.io/projected/a173f8ae-85cc-411e-94ff-2bbde014f54c-kube-api-access-7r8sm\") on node \"ip-10-0-135-229.ec2.internal\" 
DevicePath \"\"" Apr 23 13:59:56.170887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170730 2576 reconciler_common.go:299] "Volume detached for volume \"tmp-dir\" (UniqueName: \"kubernetes.io/empty-dir/a173f8ae-85cc-411e-94ff-2bbde014f54c-tmp-dir\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.170887 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.170739 2576 reconciler_common.go:299] "Volume detached for volume \"dshm\" (UniqueName: \"kubernetes.io/empty-dir/3ac64d62-c91b-4e21-8923-8839b361cfe0-dshm\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 13:59:56.812483 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.812456 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve-ci-e2e-test_custom-route-timeout-pd-test-kserve-68777759dd-2rvhm_3ac64d62-c91b-4e21-8923-8839b361cfe0/main/0.log" Apr 23 13:59:56.813092 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.813070 2576 generic.go:358] "Generic (PLEG): container finished" podID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerID="ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b" exitCode=137 Apr 23 13:59:56.813195 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.813141 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" event={"ID":"3ac64d62-c91b-4e21-8923-8839b361cfe0","Type":"ContainerDied","Data":"ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b"} Apr 23 13:59:56.813195 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.813155 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" Apr 23 13:59:56.813195 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.813171 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm" event={"ID":"3ac64d62-c91b-4e21-8923-8839b361cfe0","Type":"ContainerDied","Data":"b84c3c7a1276b527779b44c8bb212401dc2ce19828a1768459132b1e2c08128c"} Apr 23 13:59:56.813195 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.813192 2576 scope.go:117] "RemoveContainer" containerID="ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b" Apr 23 13:59:56.814728 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.814706 2576 generic.go:358] "Generic (PLEG): container finished" podID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerID="851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78" exitCode=137 Apr 23 13:59:56.814834 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.814780 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" Apr 23 13:59:56.814834 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.814789 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" event={"ID":"a173f8ae-85cc-411e-94ff-2bbde014f54c","Type":"ContainerDied","Data":"851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78"} Apr 23 13:59:56.814834 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.814824 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7" event={"ID":"a173f8ae-85cc-411e-94ff-2bbde014f54c","Type":"ContainerDied","Data":"8872be730ef3f0e18d6503c4430cbdbeb5c68d5e055f1d4d9aa52e28d9672c84"} Apr 23 13:59:56.822891 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.822875 2576 scope.go:117] "RemoveContainer" containerID="dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549" Apr 23 13:59:56.835798 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.835782 2576 scope.go:117] "RemoveContainer" containerID="af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a" Apr 23 13:59:56.837283 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.837260 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm"] Apr 23 13:59:56.839601 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.839580 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-68777759dd-2rvhm"] Apr 23 13:59:56.843493 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.843478 2576 scope.go:117] "RemoveContainer" containerID="ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b" Apr 23 13:59:56.843721 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:59:56.843704 2576 log.go:32] "ContainerStatus 
from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b\": container with ID starting with ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b not found: ID does not exist" containerID="ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b" Apr 23 13:59:56.843782 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.843729 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b"} err="failed to get container status \"ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b\": rpc error: code = NotFound desc = could not find container \"ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b\": container with ID starting with ba5d59a892319b744486869bccb910e64943a776c893666474e425dfec7f7e8b not found: ID does not exist" Apr 23 13:59:56.843782 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.843746 2576 scope.go:117] "RemoveContainer" containerID="dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549" Apr 23 13:59:56.843985 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:59:56.843967 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549\": container with ID starting with dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549 not found: ID does not exist" containerID="dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549" Apr 23 13:59:56.844043 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.843991 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549"} err="failed to get container status 
\"dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549\": rpc error: code = NotFound desc = could not find container \"dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549\": container with ID starting with dca17942fc6c43269fb350d5643570271ea14d6372dc82b24e848da46c689549 not found: ID does not exist" Apr 23 13:59:56.844043 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.844021 2576 scope.go:117] "RemoveContainer" containerID="af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a" Apr 23 13:59:56.844230 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:59:56.844214 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a\": container with ID starting with af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a not found: ID does not exist" containerID="af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a" Apr 23 13:59:56.844291 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.844238 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a"} err="failed to get container status \"af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a\": rpc error: code = NotFound desc = could not find container \"af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a\": container with ID starting with af7e2d69651f806056e25a9127f0b49c53a48b4e12b866981a162b725272299a not found: ID does not exist" Apr 23 13:59:56.844291 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.844254 2576 scope.go:117] "RemoveContainer" containerID="851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78" Apr 23 13:59:56.851163 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.851141 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" 
pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7"] Apr 23 13:59:56.851607 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.851588 2576 scope.go:117] "RemoveContainer" containerID="30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0" Apr 23 13:59:56.859454 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.859434 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["kserve-ci-e2e-test/custom-route-timeout-pd-test-kserve-prefill-84775ffd7f-ljgd7"] Apr 23 13:59:56.861309 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.861292 2576 scope.go:117] "RemoveContainer" containerID="851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78" Apr 23 13:59:56.861680 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:59:56.861661 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78\": container with ID starting with 851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78 not found: ID does not exist" containerID="851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78" Apr 23 13:59:56.861749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.861686 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78"} err="failed to get container status \"851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78\": rpc error: code = NotFound desc = could not find container \"851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78\": container with ID starting with 851d41d53e82d98ff7f96c30fed6b00adaf2bef179a2a817d21531aa00c82c78 not found: ID does not exist" Apr 23 13:59:56.861749 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.861711 2576 scope.go:117] "RemoveContainer" 
containerID="30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0" Apr 23 13:59:56.861989 ip-10-0-135-229 kubenswrapper[2576]: E0423 13:59:56.861964 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0\": container with ID starting with 30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0 not found: ID does not exist" containerID="30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0" Apr 23 13:59:56.862043 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:56.861994 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0"} err="failed to get container status \"30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0\": rpc error: code = NotFound desc = could not find container \"30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0\": container with ID starting with 30c9e4523b9b50d07e8f47ae4d1b653d0a8ba6e041654266cfe0e13fc91c7bd0 not found: ID does not exist" Apr 23 13:59:57.825168 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:57.825132 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" path="/var/lib/kubelet/pods/3ac64d62-c91b-4e21-8923-8839b361cfe0/volumes" Apr 23 13:59:57.825613 ip-10-0-135-229 kubenswrapper[2576]: I0423 13:59:57.825600 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" path="/var/lib/kubelet/pods/a173f8ae-85cc-411e-94ff-2bbde014f54c/volumes" Apr 23 14:01:19.886715 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:01:19.886690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 
14:01:19.888903 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:01:19.888882 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log" Apr 23 14:02:15.659141 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659107 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-xtcmf/must-gather-v5cqz"] Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659469 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="storage-initializer" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659481 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="storage-initializer" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659493 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659499 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659506 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="llm-d-routing-sidecar" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659513 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="llm-d-routing-sidecar" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659519 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" 
containerName="storage-initializer" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659523 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="storage-initializer" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659533 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659538 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659544 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="storage-initializer" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659549 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="storage-initializer" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659573 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659579 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659595 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659600 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" Apr 23 
14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659607 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="storage-initializer" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659612 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="storage-initializer" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659663 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659671 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3ac64d62-c91b-4e21-8923-8839b361cfe0" containerName="llm-d-routing-sidecar" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659679 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="a173f8ae-85cc-411e-94ff-2bbde014f54c" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659685 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="641ddd3a-17b2-4dc1-a420-576fa2a331b7" containerName="main" Apr 23 14:02:15.661726 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.659691 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="00b3535e-4730-4095-a7b8-1cbe82f0873a" containerName="main" Apr 23 14:02:15.663891 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.663866 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:15.666652 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.666630 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtcmf\"/\"openshift-service-ca.crt\"" Apr 23 14:02:15.666652 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.666648 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-xtcmf\"/\"kube-root-ca.crt\"" Apr 23 14:02:15.667882 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.667863 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-xtcmf\"/\"default-dockercfg-5tcrv\"" Apr 23 14:02:15.671820 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.671797 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtcmf/must-gather-v5cqz"] Apr 23 14:02:15.775709 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.775687 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-must-gather-output\") pod \"must-gather-v5cqz\" (UID: \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\") " pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:15.775839 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.775738 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-lkc4z\" (UniqueName: \"kubernetes.io/projected/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-kube-api-access-lkc4z\") pod \"must-gather-v5cqz\" (UID: \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\") " pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:15.876143 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.876118 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"must-gather-output\" (UniqueName: 
\"kubernetes.io/empty-dir/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-must-gather-output\") pod \"must-gather-v5cqz\" (UID: \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\") " pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:15.876259 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.876159 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-lkc4z\" (UniqueName: \"kubernetes.io/projected/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-kube-api-access-lkc4z\") pod \"must-gather-v5cqz\" (UID: \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\") " pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:15.876481 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.876461 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-must-gather-output\") pod \"must-gather-v5cqz\" (UID: \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\") " pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:15.885105 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.885084 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-lkc4z\" (UniqueName: \"kubernetes.io/projected/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-kube-api-access-lkc4z\") pod \"must-gather-v5cqz\" (UID: \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\") " pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:15.973767 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:15.973718 2576 util.go:30] "No sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:16.092648 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:16.092497 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-xtcmf/must-gather-v5cqz"] Apr 23 14:02:16.094761 ip-10-0-135-229 kubenswrapper[2576]: W0423 14:02:16.094731 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-besteffort.slice/kubepods-besteffort-pod3c79e9f6_d13d_4edd_8ee1_565c3b17e5bb.slice/crio-2f31ac0ef7df279c4f4d0b2dec06dd57bae81cddd69e7ebf43395776e7fb4186 WatchSource:0}: Error finding container 2f31ac0ef7df279c4f4d0b2dec06dd57bae81cddd69e7ebf43395776e7fb4186: Status 404 returned error can't find the container with id 2f31ac0ef7df279c4f4d0b2dec06dd57bae81cddd69e7ebf43395776e7fb4186 Apr 23 14:02:16.096372 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:16.096355 2576 provider.go:93] Refreshing cache for provider: *credentialprovider.defaultDockerConfigProvider Apr 23 14:02:16.275176 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:16.275109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" event={"ID":"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb","Type":"ContainerStarted","Data":"2f31ac0ef7df279c4f4d0b2dec06dd57bae81cddd69e7ebf43395776e7fb4186"} Apr 23 14:02:21.298151 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:21.298109 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" event={"ID":"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb","Type":"ContainerStarted","Data":"27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79"} Apr 23 14:02:21.298151 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:21.298152 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" 
event={"ID":"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb","Type":"ContainerStarted","Data":"ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882"} Apr 23 14:02:21.318425 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:21.318371 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" podStartSLOduration=1.7919525969999999 podStartE2EDuration="6.318355187s" podCreationTimestamp="2026-04-23 14:02:15 +0000 UTC" firstStartedPulling="2026-04-23 14:02:16.096612153 +0000 UTC m=+1856.844614096" lastFinishedPulling="2026-04-23 14:02:20.623014753 +0000 UTC m=+1861.371016686" observedRunningTime="2026-04-23 14:02:21.316455998 +0000 UTC m=+1862.064457952" watchObservedRunningTime="2026-04-23 14:02:21.318355187 +0000 UTC m=+1862.066357143" Apr 23 14:02:44.363497 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:44.363421 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7cf45c9b8b-qfgth_de51ff65-d9f2-40df-b830-2cd1d95fe71e/router/0.log" Apr 23 14:02:45.248118 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:45.248066 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7cf45c9b8b-qfgth_de51ff65-d9f2-40df-b830-2cd1d95fe71e/router/0.log" Apr 23 14:02:46.029799 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:46.029760 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-gkm9l_8436e6df-a753-453a-8e1c-829c2637e784/authorino/0.log" Apr 23 14:02:46.073473 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:46.073450 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-shkqn_a5ddb56b-5f4c-4c85-a105-94515bb001c1/kuadrant-console-plugin/0.log" Apr 23 14:02:47.397266 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:47.397228 2576 generic.go:358] "Generic (PLEG): container finished" 
podID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerID="ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882" exitCode=0 Apr 23 14:02:47.397881 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:47.397301 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" event={"ID":"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb","Type":"ContainerDied","Data":"ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882"} Apr 23 14:02:47.397881 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:47.397618 2576 scope.go:117] "RemoveContainer" containerID="ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882" Apr 23 14:02:47.854489 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:47.854459 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xtcmf_must-gather-v5cqz_3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb/gather/0.log" Apr 23 14:02:51.604958 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:51.604926 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_global-pull-secret-syncer-x546v_23dffbb8-70d5-4737-8b86-aa438dd71cff/global-pull-secret-syncer/0.log" Apr 23 14:02:51.716348 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:51.716301 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_konnectivity-agent-wr4q4_049aee3e-d268-4368-b645-787f7d1e1152/konnectivity-agent/0.log" Apr 23 14:02:51.797899 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:51.797878 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kube-system_kube-apiserver-proxy-ip-10-0-135-229.ec2.internal_5ea2aeb3a492354a2bfeec0a963ac187/haproxy/0.log" Apr 23 14:02:53.354959 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.354925 2576 kubelet.go:2553] "SyncLoop DELETE" source="api" pods=["openshift-must-gather-xtcmf/must-gather-v5cqz"] Apr 23 14:02:53.355353 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.355145 2576 kuberuntime_container.go:864] "Killing 
container with a grace period" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerName="copy" containerID="cri-o://27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79" gracePeriod=2 Apr 23 14:02:53.360095 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.360044 2576 status_manager.go:895] "Failed to get status for pod" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" err="pods \"must-gather-v5cqz\" is forbidden: User \"system:node:ip-10-0-135-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xtcmf\": no relationship found between node 'ip-10-0-135-229.ec2.internal' and this object" Apr 23 14:02:53.360621 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.360600 2576 kubelet.go:2547] "SyncLoop REMOVE" source="api" pods=["openshift-must-gather-xtcmf/must-gather-v5cqz"] Apr 23 14:02:53.580812 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.580794 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xtcmf_must-gather-v5cqz_3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb/copy/0.log" Apr 23 14:02:53.581147 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.581132 2576 util.go:48] "No ready sandbox for pod can be found. 
Need to start a new one" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:53.583540 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.583513 2576 status_manager.go:895] "Failed to get status for pod" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" err="pods \"must-gather-v5cqz\" is forbidden: User \"system:node:ip-10-0-135-229.ec2.internal\" cannot get resource \"pods\" in API group \"\" in the namespace \"openshift-must-gather-xtcmf\": no relationship found between node 'ip-10-0-135-229.ec2.internal' and this object" Apr 23 14:02:53.716580 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.716463 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-must-gather-output\") pod \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\" (UID: \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\") " Apr 23 14:02:53.716690 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.716599 2576 reconciler_common.go:162] "operationExecutor.UnmountVolume started for volume \"kube-api-access-lkc4z\" (UniqueName: \"kubernetes.io/projected/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-kube-api-access-lkc4z\") pod \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\" (UID: \"3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb\") " Apr 23 14:02:53.718739 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.718716 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/projected/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-kube-api-access-lkc4z" (OuterVolumeSpecName: "kube-api-access-lkc4z") pod "3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" (UID: "3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb"). InnerVolumeSpecName "kube-api-access-lkc4z". 
PluginName "kubernetes.io/projected", VolumeGIDValue "" Apr 23 14:02:53.722247 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.722220 2576 operation_generator.go:781] UnmountVolume.TearDown succeeded for volume "kubernetes.io/empty-dir/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-must-gather-output" (OuterVolumeSpecName: "must-gather-output") pod "3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" (UID: "3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb"). InnerVolumeSpecName "must-gather-output". PluginName "kubernetes.io/empty-dir", VolumeGIDValue "" Apr 23 14:02:53.817735 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.817712 2576 reconciler_common.go:299] "Volume detached for volume \"must-gather-output\" (UniqueName: \"kubernetes.io/empty-dir/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-must-gather-output\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 14:02:53.817735 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.817735 2576 reconciler_common.go:299] "Volume detached for volume \"kube-api-access-lkc4z\" (UniqueName: \"kubernetes.io/projected/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb-kube-api-access-lkc4z\") on node \"ip-10-0-135-229.ec2.internal\" DevicePath \"\"" Apr 23 14:02:53.825230 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:53.825207 2576 kubelet_volumes.go:163] "Cleaned up orphaned pod volumes dir" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" path="/var/lib/kubelet/pods/3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb/volumes" Apr 23 14:02:54.421115 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.421090 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-must-gather-xtcmf_must-gather-v5cqz_3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb/copy/0.log" Apr 23 14:02:54.421493 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.421401 2576 generic.go:358] "Generic (PLEG): container finished" podID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerID="27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79" exitCode=143 Apr 23 14:02:54.421493 
ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.421456 2576 scope.go:117] "RemoveContainer" containerID="27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79" Apr 23 14:02:54.421493 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.421473 2576 util.go:48] "No ready sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-xtcmf/must-gather-v5cqz" Apr 23 14:02:54.429099 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.429081 2576 scope.go:117] "RemoveContainer" containerID="ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882" Apr 23 14:02:54.442492 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.442391 2576 scope.go:117] "RemoveContainer" containerID="27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79" Apr 23 14:02:54.442667 ip-10-0-135-229 kubenswrapper[2576]: E0423 14:02:54.442647 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79\": container with ID starting with 27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79 not found: ID does not exist" containerID="27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79" Apr 23 14:02:54.442736 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.442675 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79"} err="failed to get container status \"27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79\": rpc error: code = NotFound desc = could not find container \"27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79\": container with ID starting with 27ca2fb2469a4cefbd73bd74dd5cf22d96542db31322c77060ca7237e2909d79 not found: ID does not exist" Apr 23 14:02:54.442780 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.442739 2576 scope.go:117] 
"RemoveContainer" containerID="ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882" Apr 23 14:02:54.443006 ip-10-0-135-229 kubenswrapper[2576]: E0423 14:02:54.442984 2576 log.go:32] "ContainerStatus from runtime service failed" err="rpc error: code = NotFound desc = could not find container \"ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882\": container with ID starting with ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882 not found: ID does not exist" containerID="ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882" Apr 23 14:02:54.443086 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:54.443014 2576 pod_container_deletor.go:53] "DeleteContainer returned error" containerID={"Type":"cri-o","ID":"ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882"} err="failed to get container status \"ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882\": rpc error: code = NotFound desc = could not find container \"ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882\": container with ID starting with ee1c31adc027aadf0a8b6ada1038587e21459f6ec2272568504d1d08a2b8a882 not found: ID does not exist" Apr 23 14:02:55.575145 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:55.575114 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_authorino-68bd676465-gkm9l_8436e6df-a753-453a-8e1c-829c2637e784/authorino/0.log" Apr 23 14:02:55.685428 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:55.685402 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kuadrant-system_kuadrant-console-plugin-6c886788f8-shkqn_a5ddb56b-5f4c-4c85-a105-94515bb001c1/kuadrant-console-plugin/0.log" Apr 23 14:02:56.796710 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:56.796683 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1e3c36ce-c4ae-4013-80e3-e50f781cccbb/alertmanager/0.log" Apr 23 14:02:56.821702 ip-10-0-135-229 
kubenswrapper[2576]: I0423 14:02:56.821682 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1e3c36ce-c4ae-4013-80e3-e50f781cccbb/config-reloader/0.log" Apr 23 14:02:56.846632 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:56.846598 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1e3c36ce-c4ae-4013-80e3-e50f781cccbb/kube-rbac-proxy-web/0.log" Apr 23 14:02:56.870744 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:56.870723 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1e3c36ce-c4ae-4013-80e3-e50f781cccbb/kube-rbac-proxy/0.log" Apr 23 14:02:56.896027 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:56.896006 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1e3c36ce-c4ae-4013-80e3-e50f781cccbb/kube-rbac-proxy-metric/0.log" Apr 23 14:02:56.922297 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:56.922282 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1e3c36ce-c4ae-4013-80e3-e50f781cccbb/prom-label-proxy/0.log" Apr 23 14:02:56.957040 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:56.957023 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_alertmanager-main-0_1e3c36ce-c4ae-4013-80e3-e50f781cccbb/init-config-reloader/0.log" Apr 23 14:02:56.997901 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:56.997883 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_cluster-monitoring-operator-75587bd455-kh6vf_12ced166-dcdf-4e6f-9ff5-77d972bf0902/cluster-monitoring-operator/0.log" Apr 23 14:02:57.115708 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:57.115655 2576 log.go:25] "Finished parsing log file" 
path="/var/log/pods/openshift-monitoring_metrics-server-7f65b95986-th9fz_fc39319f-3709-42ba-822c-fe086f71c769/metrics-server/0.log"
Apr 23 14:02:57.378649 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:57.378599 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pcpnf_00adbaa1-560d-44f6-bc21-2fbd5b8b655e/node-exporter/0.log"
Apr 23 14:02:57.408052 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:57.408037 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pcpnf_00adbaa1-560d-44f6-bc21-2fbd5b8b655e/kube-rbac-proxy/0.log"
Apr 23 14:02:57.438753 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:57.438731 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_node-exporter-pcpnf_00adbaa1-560d-44f6-bc21-2fbd5b8b655e/init-textfile/0.log"
Apr 23 14:02:57.876519 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:57.876495 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-w58g9_0f358a06-d61f-4f99-8778-a8ede80db2a5/prometheus-operator/0.log"
Apr 23 14:02:57.928265 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:57.928244 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_prometheus-operator-5676c8c784-w58g9_0f358a06-d61f-4f99-8778-a8ede80db2a5/kube-rbac-proxy/0.log"
Apr 23 14:02:58.021646 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:58.021623 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59b6ff9b68-xmmxk_40352799-93ad-4426-a7dc-d5487b70dc40/telemeter-client/0.log"
Apr 23 14:02:58.076132 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:58.076112 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59b6ff9b68-xmmxk_40352799-93ad-4426-a7dc-d5487b70dc40/reload/0.log"
Apr 23 14:02:58.104566 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:02:58.104542 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-monitoring_telemeter-client-59b6ff9b68-xmmxk_40352799-93ad-4426-a7dc-d5487b70dc40/kube-rbac-proxy/0.log"
Apr 23 14:03:00.101700 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.101658 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/2.log"
Apr 23 14:03:00.107675 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.107643 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console-operator_console-operator-9d4b6777b-8wz65_bbd5d80a-4c64-4b49-a070-1d5161e7afc9/console-operator/3.log"
Apr 23 14:03:00.557921 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.557901 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_console-59cf858f47-w8wxg_66fad32d-5822-402d-87d1-dc8fe7cae6be/console/0.log"
Apr 23 14:03:00.587838 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.587813 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-console_downloads-6bcc868b7-rtxwf_42df0b03-5714-4dda-b296-740fe29fb69f/download-server/0.log"
Apr 23 14:03:00.715626 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.715605 2576 kubelet.go:2537] "SyncLoop ADD" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"]
Apr 23 14:03:00.715922 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.715911 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerName="gather"
Apr 23 14:03:00.715970 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.715924 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerName="gather"
Apr 23 14:03:00.715970 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.715937 2576 cpu_manager.go:401] "RemoveStaleState: containerMap: removing container" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerName="copy"
Apr 23 14:03:00.715970 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.715943 2576 state_mem.go:107] "Deleted CPUSet assignment" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerName="copy"
Apr 23 14:03:00.716065 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.716002 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerName="gather"
Apr 23 14:03:00.716065 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.716012 2576 memory_manager.go:356] "RemoveStaleState removing state" podUID="3c79e9f6-d13d-4edd-8ee1-565c3b17e5bb" containerName="copy"
Apr 23 14:03:00.723467 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.723451 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.726062 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.726043 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s9qb\"/\"openshift-service-ca.crt\""
Apr 23 14:03:00.727233 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.727212 2576 reflector.go:430] "Caches populated" type="*v1.ConfigMap" reflector="object-\"openshift-must-gather-7s9qb\"/\"kube-root-ca.crt\""
Apr 23 14:03:00.727391 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.727233 2576 reflector.go:430] "Caches populated" type="*v1.Secret" reflector="object-\"openshift-must-gather-7s9qb\"/\"default-dockercfg-b5cf6\""
Apr 23 14:03:00.729353 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.729318 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"]
Apr 23 14:03:00.868291 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.868232 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-sys\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.868291 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.868264 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"kube-api-access-k4pxt\" (UniqueName: \"kubernetes.io/projected/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-kube-api-access-k4pxt\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.868466 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.868291 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-podres\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.868466 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.868356 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-proc\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.868466 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.868398 2576 reconciler_common.go:251] "operationExecutor.VerifyControllerAttachedVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-lib-modules\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969528 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969501 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-sys\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969624 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969535 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"kube-api-access-k4pxt\" (UniqueName: \"kubernetes.io/projected/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-kube-api-access-k4pxt\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969624 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969553 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-podres\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969624 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969573 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-proc\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969624 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969612 2576 reconciler_common.go:224] "operationExecutor.MountVolume started for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-lib-modules\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969624 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969616 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"sys\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-sys\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969872 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969685 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"podres\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-podres\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969872 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969716 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"proc\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-proc\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.969872 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.969726 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"lib-modules\" (UniqueName: \"kubernetes.io/host-path/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-lib-modules\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:00.978833 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:00.978806 2576 operation_generator.go:615] "MountVolume.SetUp succeeded for volume \"kube-api-access-k4pxt\" (UniqueName: \"kubernetes.io/projected/c37d9d6e-13ee-41b8-8afa-e20fcde644a8-kube-api-access-k4pxt\") pod \"perf-node-gather-daemonset-npfwc\" (UID: \"c37d9d6e-13ee-41b8-8afa-e20fcde644a8\") " pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:01.034472 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.034450 2576 util.go:30] "No sandbox for pod can be found. Need to start a new one" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:01.062780 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.062755 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-cluster-storage-operator_volume-data-source-validator-7c6cbb6c87-cbx2l_60cc1ca9-d0ba-4394-a632-3383e1572f9d/volume-data-source-validator/0.log"
Apr 23 14:03:01.362198 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.362175 2576 kubelet.go:2544] "SyncLoop UPDATE" source="api" pods=["openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"]
Apr 23 14:03:01.364290 ip-10-0-135-229 kubenswrapper[2576]: W0423 14:03:01.364267 2576 manager.go:1169] Failed to process watch event {EventType:0 Name:/kubepods.slice/kubepods-podc37d9d6e_13ee_41b8_8afa_e20fcde644a8.slice/crio-f6024245b20640b578a46511f8867fba51a474ea9d83e63eaadbc34008ed2642 WatchSource:0}: Error finding container f6024245b20640b578a46511f8867fba51a474ea9d83e63eaadbc34008ed2642: Status 404 returned error can't find the container with id f6024245b20640b578a46511f8867fba51a474ea9d83e63eaadbc34008ed2642
Apr 23 14:03:01.448112 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.448087 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc" event={"ID":"c37d9d6e-13ee-41b8-8afa-e20fcde644a8","Type":"ContainerStarted","Data":"a8ae7e7664775f2013b0264de869fa54164d60056742f0a418cf2641e9ebcedf"}
Apr 23 14:03:01.448233 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.448120 2576 kubelet.go:2569] "SyncLoop (PLEG): event for pod" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc" event={"ID":"c37d9d6e-13ee-41b8-8afa-e20fcde644a8","Type":"ContainerStarted","Data":"f6024245b20640b578a46511f8867fba51a474ea9d83e63eaadbc34008ed2642"}
Apr 23 14:03:01.448233 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.448190 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="not ready" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:01.468132 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.468084 2576 pod_startup_latency_tracker.go:104] "Observed pod startup duration" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc" podStartSLOduration=1.468070166 podStartE2EDuration="1.468070166s" podCreationTimestamp="2026-04-23 14:03:00 +0000 UTC" firstStartedPulling="0001-01-01 00:00:00 +0000 UTC" lastFinishedPulling="0001-01-01 00:00:00 +0000 UTC" observedRunningTime="2026-04-23 14:03:01.466754402 +0000 UTC m=+1902.214756362" watchObservedRunningTime="2026-04-23 14:03:01.468070166 +0000 UTC m=+1902.216072122"
Apr 23 14:03:01.844007 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.843980 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-678lb_fa0283a7-9d03-45eb-8654-fca71445e53e/dns/0.log"
Apr 23 14:03:01.871715 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:01.871690 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_dns-default-678lb_fa0283a7-9d03-45eb-8654-fca71445e53e/kube-rbac-proxy/0.log"
Apr 23 14:03:02.144772 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:02.144706 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-dns_node-resolver-89qsg_8e915a91-9dcb-4454-9ac4-0012727f6bdd/dns-node-resolver/0.log"
Apr 23 14:03:02.647057 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:02.647025 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_image-registry-69f68b99b9-xc5cv_d51efffd-4a46-43d3-a25d-88497f1ec487/registry/0.log"
Apr 23 14:03:02.731348 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:02.731300 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-image-registry_node-ca-7g8xg_fccbedc6-6cb0-47bb-8b72-95f91484d090/node-ca/0.log"
Apr 23 14:03:03.657593 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:03.657565 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress_router-default-7cf45c9b8b-qfgth_de51ff65-d9f2-40df-b830-2cd1d95fe71e/router/0.log"
Apr 23 14:03:04.131684 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:04.131660 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ingress-canary_ingress-canary-cq9bf_c01c5eac-4623-474c-9b1f-de78e668fd57/serve-healthcheck-canary/0.log"
Apr 23 14:03:04.668044 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:04.668012 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4dwkv_433dd8e3-1838-432a-bf89-c07ffb6fef04/kube-rbac-proxy/0.log"
Apr 23 14:03:04.689688 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:04.689664 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4dwkv_433dd8e3-1838-432a-bf89-c07ffb6fef04/exporter/0.log"
Apr 23 14:03:04.710539 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:04.710517 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-insights_insights-runtime-extractor-4dwkv_433dd8e3-1838-432a-bf89-c07ffb6fef04/extractor/0.log"
Apr 23 14:03:07.466505 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:07.466477 2576 kubelet.go:2658] "SyncLoop (probe)" probe="readiness" status="ready" pod="openshift-must-gather-7s9qb/perf-node-gather-daemonset-npfwc"
Apr 23 14:03:07.731950 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:07.731880 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-lws-operator_lws-controller-manager-868f457486-sm2xf_760155eb-85e1-4817-84fd-7dcc3f8f3c54/manager/0.log"
Apr 23 14:03:08.630666 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:08.630633 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_odh-model-controller-696fc77849-7dz84_e9584647-a6a8-48b8-8e90-ef64387c7b72/manager/0.log"
Apr 23 14:03:08.651782 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:08.651761 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_s3-init-ljq5v_d32521c3-bfb8-4bef-82ea-fa15b572a3a7/s3-init/0.log"
Apr 23 14:03:08.681229 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:08.681206 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/kserve_seaweedfs-86cc847c5c-nxr6l_2f643ad5-5f62-4432-bb1f-e8cdee3a5040/seaweedfs/0.log"
Apr 23 14:03:15.681428 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:15.681402 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6xwg_58b92c1d-fc85-4d19-82d4-79f878c270ce/kube-multus-additional-cni-plugins/0.log"
Apr 23 14:03:15.743801 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:15.743773 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6xwg_58b92c1d-fc85-4d19-82d4-79f878c270ce/egress-router-binary-copy/0.log"
Apr 23 14:03:15.805228 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:15.805204 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6xwg_58b92c1d-fc85-4d19-82d4-79f878c270ce/cni-plugins/0.log"
Apr 23 14:03:15.853216 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:15.853187 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6xwg_58b92c1d-fc85-4d19-82d4-79f878c270ce/bond-cni-plugin/0.log"
Apr 23 14:03:15.900683 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:15.900665 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6xwg_58b92c1d-fc85-4d19-82d4-79f878c270ce/routeoverride-cni/0.log"
Apr 23 14:03:15.972386 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:15.972314 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6xwg_58b92c1d-fc85-4d19-82d4-79f878c270ce/whereabouts-cni-bincopy/0.log"
Apr 23 14:03:16.008579 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:16.008555 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-additional-cni-plugins-b6xwg_58b92c1d-fc85-4d19-82d4-79f878c270ce/whereabouts-cni/0.log"
Apr 23 14:03:16.611868 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:16.611838 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_multus-bltjj_10025a80-efb0-4838-a4b5-8e9ea110d4e1/kube-multus/0.log"
Apr 23 14:03:16.845858 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:16.845828 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ms5b6_6bc685f5-9bf7-4830-a9ad-e4622169dcdb/network-metrics-daemon/0.log"
Apr 23 14:03:16.866842 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:16.866778 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-multus_network-metrics-daemon-ms5b6_6bc685f5-9bf7-4830-a9ad-e4622169dcdb/kube-rbac-proxy/0.log"
Apr 23 14:03:17.803826 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:17.803798 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6mgfk_8df3caba-2d71-4077-8a19-92dfab41c079/ovn-controller/0.log"
Apr 23 14:03:17.835440 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:17.835413 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6mgfk_8df3caba-2d71-4077-8a19-92dfab41c079/ovn-acl-logging/0.log"
Apr 23 14:03:17.860107 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:17.860076 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6mgfk_8df3caba-2d71-4077-8a19-92dfab41c079/kube-rbac-proxy-node/0.log"
Apr 23 14:03:17.887970 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:17.887947 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6mgfk_8df3caba-2d71-4077-8a19-92dfab41c079/kube-rbac-proxy-ovn-metrics/0.log"
Apr 23 14:03:17.927437 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:17.927417 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6mgfk_8df3caba-2d71-4077-8a19-92dfab41c079/northd/0.log"
Apr 23 14:03:17.967255 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:17.967233 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6mgfk_8df3caba-2d71-4077-8a19-92dfab41c079/nbdb/0.log"
Apr 23 14:03:18.004070 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:18.004050 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6mgfk_8df3caba-2d71-4077-8a19-92dfab41c079/sbdb/0.log"
Apr 23 14:03:18.118291 ip-10-0-135-229 kubenswrapper[2576]: I0423 14:03:18.118211 2576 log.go:25] "Finished parsing log file" path="/var/log/pods/openshift-ovn-kubernetes_ovnkube-node-6mgfk_8df3caba-2d71-4077-8a19-92dfab41c079/ovnkube-controller/0.log"